Mike Carson
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780198520979
- eISBN:
- 9780191706295
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198520979.003.0013
- Subject:
- Biology, Biochemistry / Molecular Biology
The Human Genome Project went three-dimensional in late 2000. ‘Structural genomics’ efforts will determine the structures of thousands of new proteins over the next few years. These initiatives seek to streamline and automate every experimental and computational aspect of the structure determination pipeline, and most of the steps involved are covered in previous chapters of this volume. At the end of the pipeline, an atomic model is built and iteratively refined to best fit the observed data. The final atomic model, after careful analysis, is deposited in the Protein Data Bank (PDB). About 25,000 unique protein sequences are currently in the PDB. High-throughput and conventional methods will dramatically increase this number, and it is crucial that these new structures be of the highest quality. This chapter addresses software systems for interactively fitting molecular models to electron density maps and for analysing the resulting models. It is heavily biased toward proteins, but the programs can also build nucleic acid models. The chapter begins with a brief review of molecular modelling and graphics. It then discusses the best current and freely available programs with respect to their performance on common tasks. Finally, some views on the future of such software are given.
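As background to the fit-and-refine step described in this abstract (not a detail given in the abstract itself), the agreement between an atomic model and the observed diffraction data is commonly summarized by the crystallographic R-factor, with R_free computed over a test set of reflections excluded from refinement:
\[
  R \;=\; \frac{\sum_{hkl}\bigl|\,|F_{\mathrm{obs}}(hkl)| - |F_{\mathrm{calc}}(hkl)|\,\bigr|}{\sum_{hkl}|F_{\mathrm{obs}}(hkl)|}
\]
Lower values indicate a better fit of the refined model to the measured structure-factor amplitudes.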
Angelo Gavezzotti
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780198570806
- eISBN:
- 9780191718779
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570806.001.0001
- Subject:
- Physics, Atomic, Laser, and Optical Physics
Intermolecular interactions stem from the electric properties of atoms. Being the cause of molecular aggregation, intermolecular forces are at the roots of chemistry and are the fabric of the world. They are responsible for the structure and properties of all condensed bodies — the human body, the food we eat, the clothes we wear, the drugs we take, the paper on which this book is printed. In the last forty years or so, theoretical and experimental research in this area has struggled to establish correlations between the structure of the constituent molecules, the structure of the resulting condensed phase, and the observable properties of any material. As in all scientific enterprise, the steps to follow are analysis, classification, and prediction, while the final goal is control, which in this case means the deliberate design of materials with specified properties. This last step requires a synthesis and substantial command of the three preceding steps. This book provides a brief but accurate summary of all the basic ideas, theories, methods, and conspicuous results of structure analysis and molecular modelling of the condensed phases of organic compounds: quantum chemistry, the intermolecular potential, force field and molecular dynamics methods, structural correlation, and thermodynamics. The book also describes the present status of studies in the analysis, categorisation, prediction, and control, at a molecular level, of intermolecular interactions in liquids, solutions, mesophases, and crystals. The main focus here is on the links between energies, structures, and chemical or physical properties.
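As an illustration of the intermolecular potentials mentioned above (a generic functional form, not necessarily the one adopted in the book), a common atom-atom model writes the interaction energy between molecules A and B as a sum over atom pairs of a Buckingham exp-6 term plus a Coulomb term:
\[
  E_{AB} \;=\; \sum_{i \in A}\sum_{j \in B}
  \Bigl[ A_{ij}\,e^{-B_{ij} r_{ij}} \;-\; \frac{C_{ij}}{r_{ij}^{6}}
         \;+\; \frac{q_i q_j}{4\pi\varepsilon_0\, r_{ij}} \Bigr],
\]
where \(r_{ij}\) is the distance between atoms \(i\) and \(j\), the \(A_{ij}\), \(B_{ij}\), \(C_{ij}\) parameters are fitted empirically, and \(q_i\), \(q_j\) are atomic point charges.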
D. E. Jane, H. J. Olverman, and J. C. Watkins
- Published in print:
- 1995
- Published Online:
- March 2012
- ISBN:
- 9780192625021
- eISBN:
- 9780191724701
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780192625021.003.0002
- Subject:
- Neuroscience, Molecular and Cellular Systems
A wealth of information has now been generated, especially from radioligand binding studies, regarding the structural requirements for agonist and antagonist binding to the glutamate recognition site on the NMDA receptor. In combination with a number of molecular modelling studies, this information has been used to elucidate the three-dimensional topology of the receptor site and thus to provide a basis for the rational design of new pharmacological agents with agonist or antagonist action at these sites. This chapter outlines the structural features required for agonist and antagonist binding. It then discusses the various molecular modelling studies that have recently been carried out.
François Dehez
- Published in print:
- 2016
- Published Online:
- March 2016
- ISBN:
- 9780198752950
- eISBN:
- 9780191814426
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198752950.003.0013
- Subject:
- Physics, Soft Matter / Biological Physics
Accessing the full atomistic details of complex biological processes under near-physiological conditions remains challenging at the experimental level. The development of molecular modeling approaches such as molecular dynamics, together with powerful supercomputers, makes it possible to rationalize and predict, at the molecular level, the dynamics of large biomolecular assemblies under various conditions. Although still limited in terms of system size and time scales, molecular dynamics (MD) simulations have proven to be a valuable tool in the arsenal of integrated structural biology techniques. In this chapter, the standard concepts underlying molecular simulations are presented. First, the basis of molecular mechanics and the concept of the force field are explained. Then the MD method is introduced, together with some details of its implementation as commonly employed to study biomolecular systems. Finally, the complementarity of experiments and modeling is illustrated using the example of mitochondrial carriers.
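For orientation, a typical molecular-mechanics force field of the kind introduced in such a chapter writes the potential energy as a sum of bonded and non-bonded terms; the form below is a generic sketch, not the specific force field used by the author:
\[
  U \;=\; \sum_{\text{bonds}} k_b (b - b_0)^2
    \;+\; \sum_{\text{angles}} k_\theta (\theta - \theta_0)^2
    \;+\; \sum_{\text{dihedrals}} k_\phi \bigl[1 + \cos(n\phi - \delta)\bigr]
    \;+\; \sum_{i<j} \Bigl\{ 4\varepsilon_{ij}\Bigl[\Bigl(\tfrac{\sigma_{ij}}{r_{ij}}\Bigr)^{12} - \Bigl(\tfrac{\sigma_{ij}}{r_{ij}}\Bigr)^{6}\Bigr] + \frac{q_i q_j}{4\pi\varepsilon_0\, r_{ij}} \Bigr\}.
\]
An MD simulation then integrates Newton's equations of motion, \(m_i\,\ddot{\mathbf{r}}_i = -\nabla_i U\), to generate trajectories of the biomolecular system.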
Raymond Brun
- Published in print:
- 2009
- Published Online:
- May 2009
- ISBN:
- 9780199552689
- eISBN:
- 9780191720277
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199552689.003.0003
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter defines the collisional regimes of gaseous flows, including equilibrium and non-equilibrium situations described by the Euler equations. It examines monatomic and diatomic gas flows, with particular attention to vibrationally non-equilibrium flows of pure gases and gas mixtures. Boltzmann and Treanor distributions are also discussed. The evolution equations for vibrational populations or mean vibrational energy are presented, in particular for the harmonic oscillator model and the Landau-Teller equation. General cases of reactive flows are examined, and reaction rate constants are defined for reacting gases with and without internal energy taken into account. In the appendices, physical models for internal energy modes, properties of the Maxwell distribution, and particular vibrational relaxation equations are presented for pure gases and gas mixtures.
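To make the quantities named above concrete (standard textbook forms, offered here only as an illustration), the equilibrium Boltzmann population of vibrational level \(v\) of a harmonic oscillator with characteristic temperature \(\theta_v\), and the Landau-Teller relaxation of the mean vibrational energy \(E_v\) toward its equilibrium value, read
\[
  \frac{N_v}{N} \;=\; \bigl(1 - e^{-\theta_v/T}\bigr)\, e^{-v\,\theta_v/T},
  \qquad
  \frac{dE_v}{dt} \;=\; \frac{\overline{E}_v(T) - E_v}{\tau_v},
\]
where \(\tau_v\) is the vibrational relaxation time. The Treanor distribution generalizes the Boltzmann form to anharmonic oscillators whose vibrational and translational temperatures differ.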
Henk W. de Regt
- Published in print:
- 2017
- Published Online:
- August 2017
- ISBN:
- 9780190652913
- eISBN:
- 9780190652944
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780190652913.003.0006
- Subject:
- Philosophy, Philosophy of Science
This chapter analyzes the role of mechanical modeling in nineteenth-century physics, showing how precisely mechanical models were used to enhance scientific understanding. It discusses the work and ideas of William Thomson (Lord Kelvin), James Clerk Maxwell, and Ludwig Boltzmann, who advanced explicit views on the function and status of mechanical models, in particular, on their role in providing understanding. A case study of the construction of molecular models to explain the so-called specific heat anomaly highlights the role of conceptual tools in achieving understanding and shows that intelligibility is an epistemically relevant feature of mechanical models. Next, the chapter examines Boltzmann’s Bildtheorie, an interpretation of mechanical models that he developed in response to problems and criticisms of the program of mechanical explanation, and his associated pragmatic conception of understanding. The final section discusses the limitations of mechanical models and Ernst Mach’s criticism of the mechanical program.
Andrea Woody and Clark Glymour
- Published in print:
- 2000
- Published Online:
- November 2020
- ISBN:
- 9780195128345
- eISBN:
- 9780197561416
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195128345.003.0008
- Subject:
- Chemistry, Theoretical Chemistry
In the late middle ages, chemistry was the science and technology closest to philosophy, the material realization of the method of analysis and synthesis. No longer. Contemporary philosophy is concerned with many sciences—physics, psychology, biology, linguistics, economics—but chemistry is not among them. Why not? Every discipline has particular problems with some philosophical coloring. Those in quantum theory are famous; those in psychology seem endless; those in biology and economics seem more sparse and esoteric. If, for whatever reason, one’s concern is the conceptual or theoretical problems of a particular science, there is no substitute for that science, and chemistry is just one among others. Certain sciences naturally touch on substantive areas of traditional philosophical concern: quantum theory on metaphysics, for example, psychology on the philosophy of mind, and economics and statistics on theories of rationality. In these cases, there is a special interest in particular sciences because they may reform prior philosophical theories or recast philosophical issues or, conversely, because philosophy may inform these subjects in fundamental ways. That is not true, in any obvious way, of chemistry. So what good, then, what special value, does chemistry offer contemporary philosophy of science? Typically philosophical problems, even problems in philosophy of science, are not confined to a particular science. For general problems—problems about representation, inference, discovery, explanation, realism, intertheoretic and interdisciplinary relations, and so on—what is needed are scientific illustrations that go to the heart of the matter without requiring specialized technical knowledge of the reader. The science needed for most philosophy is familiar, not esoteric, right in the middle of things, mature and diverse enough to illustrate a variety of fundamental issues. Almost uniquely, chemistry fits the description. In philosophy of science, too often an effort gains in weight and seriousness merely because it requires mastery of an intricate and arcane subject, regardless of the philosophical interest of what it says. Yet, surely, there is something contrived, even phony, in illustrating a philosophical point with a discussion of the top quark if the point could be shown as well with a discussion of the ideal gas law.
Zoltan Toroczkai and György Korniss
- Published in print:
- 2005
- Published Online:
- November 2020
- ISBN:
- 9780195177374
- eISBN:
- 9780197562260
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195177374.003.0020
- Subject:
- Computer Science, Mathematical Theory of Computation
In most cases, it is impossible to describe and understand complex system dynamics via analytical methods. The density of problems that are rigorously solvable with analytic tools is vanishingly small in the set of all problems, and often the only way one can reliably obtain a system-level understanding of such problems is through direct simulation. This chapter broadens the discussion on the relationship between complexity and statistical physics by exploring how the computational scalability of parallelized simulation can be analyzed using a physical model of surface growth. Specifically, the systems considered here are made up of a large number of interacting individual elements with a finite number of attributes, or local state variables, each assuming a countable number (typically finite) of values. The dynamics of the local state variables are discrete events occurring in continuous time. Between two consecutive updates, the local variables stay unchanged. Another important assumption we make is that the interactions in the underlying system to be simulated have finite range. Examples of such systems include: magnetic systems (spin states and spin flip dynamics); surface growth via molecular beam epitaxy (height of the surface, molecular deposition, and diffusion dynamics); epidemiology (health of an individual, the dynamics of infection and recovery); financial markets (wealth state, buy/sell dynamics); and wireless communications or queueing systems (number of jobs, job arrival dynamics). Often—as in the case we study here—the dynamics of such systems are inherently stochastic and asynchronous. The simulation of such systems is nontrivial, and in most cases the complexity of the problem requires simulations on distributed architectures, defining the field of parallel discrete-event simulations (PDES) [186, 367, 416]. Conceptually, the computational task is divided among n processing elements (PEs), where each processor evolves the dynamics of the allocated piece. Due to the interactions among the individual elements of the simulated system (spins, atoms, packets, calls, etc.), the PEs must coordinate with a subset of other PEs during the simulation. For example, the state of a spin can only be updated if the state of the neighbors is known. However, some neighbors might belong to the computational domain of another PE; thus, message passing is required in order to preserve causality.
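The conservative synchronization rule sketched in this abstract can be illustrated with a minimal serial emulation (a hypothetical sketch, not code from the chapter): each processing element carries a local virtual time and may only advance when neither nearest neighbour could still send it an earlier-time-stamped event; the set of local times then forms the growing "surface" whose roughness the scalability analysis studies.

import random

def simulate(n_pe=100, steps=10_000, seed=1):
    """Serial emulation of a basic conservative PDES scheme: one lattice
    site per processing element (PE) on a ring, Poisson (exponential)
    update attempts, nearest-neighbour interactions."""
    random.seed(seed)
    tau = [0.0] * n_pe              # local virtual times (the "time horizon")
    avg_utilization = 0.0           # mean fraction of PEs able to update per step
    for _ in range(steps):
        active = 0
        snapshot = tau[:]           # all PEs apply the rule to the same configuration
        for i in range(n_pe):
            left = snapshot[i - 1]
            right = snapshot[(i + 1) % n_pe]
            # Conservative rule: PE i advances only if no neighbour could
            # still deliver an event with an earlier time stamp.
            if snapshot[i] <= min(left, right):
                tau[i] += random.expovariate(1.0)
                active += 1
        avg_utilization += active / n_pe
    return tau, avg_utilization / steps

Measuring the width (roughness) of the tau profile as n_pe grows is one way to study how well such a scheme scales, which is the surface-growth analogy the chapter develops.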
Eva Pebay-Peyroula, Hugues Nury, François Parcy, Rob W. H. Ruigrok, Christine Ziegler, and Leticia F. Cugliandolo (eds)
- Published in print:
- 2016
- Published Online:
- March 2016
- ISBN:
- 9780198752950
- eISBN:
- 9780191814426
- Item type:
- book
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198752950.001.0001
- Subject:
- Physics, Soft Matter / Biological Physics
The aim of this book is to provide new ideas for studying living matter through a simultaneous understanding of behavior from the molecule to the cell to the whole organism, in the light of physical concepts. Indeed, forces guide most biological phenomena. In some cases these forces can be well described and thus used to model a particular biological phenomenon. This is exemplified here by the study of membranes, whose shapes and curvatures can be modeled using a limited number of parameters that are measured experimentally. The growth of plants is another example where the combination of physics, biology, and mathematics leads to a predictive model. The laws of thermodynamics are essential, as they dictate the behavior of proteins, or more generally biological molecules, in an aqueous environment. Integrated studies from the molecule to larger scales need a combination of cutting-edge approaches, such as the use of new X-ray sources, in-cell NMR, cryo-electron microscopy, or single-molecule microscopy. Some are described in dedicated chapters, while others are mentioned in discussions of particular topics, such as the interactions between HIV and host cells, which are being progressively deciphered thanks to recent developments in various types of microscopy. All the concepts and methods developed in this book are illustrated through three main biological questions: host–pathogen interactions, plant development and flowering, and membrane processes. Through these examples, the book intends to highlight how integrated biology, including physics and mathematics, is a very powerful approach.