*Christof Koch*

- Published in print:
- 1998
- Published Online:
- November 2020
- ISBN:
- 9780195104912
- eISBN:
- 9780197562338
- Item type:
- chapter

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195104912.003.0027
- Subject:
- Computer Science, Mathematical Theory of Computation


We have now arrived at the end of the book. The first 16 chapters dealt with linear and nonlinear cable theory, voltage-dependent ionic currents, the biophysical origin of spike initiation and propagation, the statistical properties of spike trains and neural coding, bursting, dendritic spines, synaptic transmission and plasticity, the types of interactions that can occur among synaptic inputs in a passive or active dendritic arbor, and the diffusion and buffering of calcium and other ions. We attempted to weave these disparate threads into a single tapestry in Chaps. 17-19, demonstrating how these elements interact within a single neuron. The penultimate chapter dealt with various unconventional biophysical and biochemical mechanisms that could instantiate computations at the molecular and the network levels. It is time to summarize. What have we learned about the way brains do or do not compute? The brain has frequently been compared to a universal Turing machine (for a very lucid account, see Hofstadter, 1979). A Turing machine is a mathematical abstraction meant to clarify what is meant by algorithm, computation, and computable. Think of it as a machine with a finite number of internal states and an infinite tape; it can read messages composed in a finite alphabet, write an output, and store intermediate results as memory. A universal Turing machine is one that can mimic any arbitrary Turing machine. We are not interested here in the renewed debate over whether the brain can, in principle, be treated as such a machine (Lucas, 1964; Penrose, 1989), but in whether it is useful to conceptualize nervous systems in this manner. Because brains have limited precision, possess only finite amounts of memory, and do not live forever, they cannot possibly be like “real” Turing machines. It is therefore more appropriate to ask: to what extent can brains be treated as finite state machines, or automata? Such a machine has only finite computational and memory resources (Hopcroft and Ullman, 1979). The answer has to be an ambiguous “it depends.”
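The contrast the chapter draws can be made concrete with a toy simulator. The sketch below (an illustrative example, not from the book; the transition table and names are hypothetical) implements a Turing-machine-style control: a finite set of states plus a tape, with a transition table mapping (state, symbol) to (new state, written symbol, head move). Bounding the step count and tape, as any physical system must, is exactly what reduces it to a finite automaton:

```python
def run_turing_machine(transitions, tape, state="start", steps=100):
    """Simulate a Turing-machine-style device for a bounded number of steps.

    A true Turing machine has an unbounded tape and unbounded time;
    the bounds here make this, in effect, a finite state machine.
    """
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")  # "_" denotes a blank cell
        state, tape[head], move = transitions[(state, symbol)]
        head += 1 if move == "R" else -1
    return state, [tape[i] for i in sorted(tape)]

# Example program: flip every bit, then halt on the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
state, out = run_turing_machine(flip, "1011")
# state is "halt"; out spells 0100 followed by a blank
```

A universal machine would differ only in that the transition table itself is read from the tape rather than fixed in advance.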


*Fred V. Brock and Scott J. Richardson*

- Published in print:
- 2001
- Published Online:
- November 2020
- ISBN:
- 9780195134513
- eISBN:
- 9780197561584
- Item type:
- chapter

- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780195134513.003.0005
- Subject:
- Earth Sciences and Geography, Meteorology and Climatology


Sensor performance characteristics are generally divided into at least two categories: static and dynamic. Additional categories sometimes used include drift and exposure errors. The performance of sensors in conditions where the measurand is constant or very slowly changing can be characterized by static parameters. Dynamic performance modeling requires the use of differential equations to account for the relation between sensor input and output when the input is rapidly varying. Static characteristics due to friction or other nonlinear effects would vastly complicate the differential equations, so, even when the input is not steady, static and dynamic characteristics are considered separately. Static characteristics are determined by carefully excluding dynamic effects; dynamic characteristics are assessed by assuming that all static effects have been excluded or compensated. Many of these terms have been encountered in Chaps. 1 and 2, although without formal definitions.

- **Analog signal.** A signal whose information content is continuously proportional to the measurand. If an electrical temperature sensor has a voltage output, that voltage signal fluctuates with the sensor temperature. The voltage output is continuously proportional to the measurand (temperature) and is analogous to it; hence we refer to the sensor output as an analog signal.
- **Data display.** Any mechanism for displaying data to the user. The stem of a mercury-in-glass thermometer with an attached scale is a data display.
- **Data storage.** A memory element or mechanism for holding data and later recovering them, such as a disk or magnetic tape. Again, this could be as simple as a piece of paper.
- **Data transmission.** The process of sending a signal from one place to another. The data transmission medium could be a piece of paper, a magnetic tape, radio or light waves, or telephone wires.
- **Digital signal.** A signal whose information content varies in discrete steps. The step size can be made arbitrarily small, such that a plot of a digitized signal can closely resemble the analog signal. However, the granularity of a digital signal will be revealed if it is examined in sufficient detail.
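The granularity point in the last definition is easy to demonstrate numerically. This sketch (illustrative only, not from the book) digitizes a finely sampled sine wave at two quantization step sizes; the worst-case quantization error is bounded by half the step, so a smaller step tracks the analog signal more closely without ever eliminating the discreteness:

```python
import math

def quantize(x, step):
    """Quantize an analog value to the nearest multiple of `step`."""
    return step * round(x / step)

# An "analog" sine wave sampled finely, then digitized at two resolutions.
analog = [math.sin(2 * math.pi * t / 100) for t in range(100)]
coarse = [quantize(v, 0.25) for v in analog]  # 9 distinct levels over [-1, 1]
fine = [quantize(v, 0.01) for v in analog]    # ~200 distinct levels

# Quantization error is bounded by half the step size:
# at most 0.125 for the coarse signal, 0.005 for the fine one.
max_err_coarse = max(abs(a, - d) if False else abs(a - d) for a, d in zip(analog, coarse))
max_err_fine = max(abs(a - d) for a, d in zip(analog, fine))
```

Plotted, the fine version is visually indistinguishable from the sine, yet zooming in far enough would still reveal its staircase structure.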


*Andrea Moro*

- Published in print:
- 2016
- Published Online:
- May 2017
- ISBN:
- 9780262034890
- eISBN:
- 9780262335621
- Item type:
- chapter

- Publisher:
- The MIT Press
- DOI:
- 10.7551/mitpress/9780262034890.003.0005
- Subject:
- Linguistics, Psycholinguistics / Neurolinguistics / Cognitive Linguistics


One of the major discoveries of modern linguistics is that languages do not vary arbitrarily: for example, all syntactic rules must be based on hierarchical structure generated by a recursive procedure rather than on linear order. Neuroimaging techniques have shown that these formal restrictions, which constitute the boundaries of Babel, are in fact represented in the brain: people who learn non-recursive, artificially designed rules do not engage the neural circuits that underpin language computation. The boundaries of Babel cannot be cultural and arbitrary.
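The contrast between hierarchical structure and linear order can be sketched with a toy context-free grammar (the rules and vocabulary below are illustrative, not from the chapter). Each phrase is built by recursively expanding sub-phrases, so the result is a tree of constituents rather than a flat, fixed sequence of words; a noun phrase can contain a verb phrase that contains another noun phrase, with no bound set by linear position:

```python
import random

# A toy recursive grammar: NP embeds VP, which embeds NP again.
# Nonterminals are keys; terminals are plain words.
grammar = {
    "S": [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive option
    "VP": [["V", "NP"], ["V"]],
    "N": [["linguist"], ["idea"]],
    "V": [["sleeps"], ["describes"]],
}

def generate(symbol, depth=0, max_depth=4):
    """Expand a symbol recursively, building a sentence hierarchically."""
    if symbol not in grammar:
        return [symbol]  # terminal word
    options = grammar[symbol]
    # Beyond max_depth, take the shortest rule so generation terminates.
    rule = min(options, key=len) if depth >= max_depth else random.choice(options)
    words = []
    for part in rule:
        words.extend(generate(part, depth + 1, max_depth))
    return words

sentence = generate("S")
```

A "non-recursive, artificially designed rule" of the kind the chapter mentions would instead operate on linear positions alone (e.g., transform a sentence by reversing its word order), and needs no tree structure at all.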
