Mathew Penrose
- Published in print:
- 2003
- Published Online:
- September 2007
- ISBN:
- 9780198506263
- eISBN:
- 9780191707858
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198506263.003.0001
- Subject:
- Mathematics, Probability / Statistics
This introductory chapter contains a general discussion of both the historical and the applied background behind the study of random geometric graphs. A brief overview is presented, along with some standard definitions in graph theory and probability theory. Specific terminology is introduced for two limiting regimes in the choice of r=r(n) (namely the thermodynamic limit where the mean vertex degree is made to approach a finite constant) and the connectivity regime (where it grows logarithmically with n). Some elementary probabilistic results are given on large deviations for the binomial and Poisson distribution, and on Poisson point processes.
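The thermodynamic-limit scaling can be checked with a small simulation (a sketch, not taken from the book; the sample size n = 2000, the unit square, and the target mean degree of 4 are illustrative choices): choosing r(n) so that n·πr² stays constant holds the mean vertex degree near that constant.

```python
import math
import random

random.seed(42)
n = 2000
c = 4.0  # target mean degree (the constant approached in the thermodynamic limit)
# On the unit square the expected degree is roughly n * pi * r^2,
# so r(n) = sqrt(c / (pi * n)) keeps it near c as n grows.
r = math.sqrt(c / (math.pi * n))

pts = [(random.random(), random.random()) for _ in range(n)]
edges = 0
for i in range(n):
    for j in range(i + 1, n):
        dx = pts[i][0] - pts[j][0]
        dy = pts[i][1] - pts[j][1]
        if dx * dx + dy * dy <= r * r:
            edges += 1

mean_degree = 2 * edges / n
print(round(mean_degree, 2))  # close to 4 (slightly below, from boundary effects)
```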
M. E. J. Newman
- Published in print:
- 2010
- Published Online:
- September 2010
- ISBN:
- 9780199206650
- eISBN:
- 9780191594175
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199206650.003.0013
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
The previous chapter looked at the classic random graph model, in which pairs of vertices are connected at random with uniform probabilities. Although this model has proved tremendously useful as a source of insight into the structure of networks, it also has a number of serious shortcomings. Chief among these is its degree distribution, which is Poisson and thus quite different from the degree distributions seen in most real-world networks. This chapter shows how to create more sophisticated random graph models, which incorporate arbitrary degree distributions and yet are still exactly solvable for many of their properties in the limit of large network size. The fundamental mathematical tool used to derive these results is the probability generating function. Exercises are provided at the end of the chapter.
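For the Poisson degree distribution the generating-function machinery gives especially clean results; as a sketch (the mean degree c = 2.5 is an arbitrary illustrative value), the generating function G0(x) = exp(c(x − 1)) leads to the self-consistency condition S = 1 − exp(−cS) for the fraction S of the network in the giant component, solvable by fixed-point iteration:

```python
import math

c = 2.5  # mean degree (illustrative)
# Poisson degree distribution: G0(x) = exp(c * (x - 1)).
# The giant-component fraction satisfies S = 1 - G0(1 - S) = 1 - exp(-c * S).
S = 0.5  # starting guess
for _ in range(200):
    S = 1.0 - math.exp(-c * S)

print(round(S, 4))  # about 0.89 of the network lies in the giant component
```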
Alexandra M. Schmidt and Marco A. Rodríguez
- Published in print:
- 2011
- Published Online:
- January 2012
- ISBN:
- 9780199694587
- eISBN:
- 9780191731921
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199694587.003.0020
- Subject:
- Mathematics, Probability / Statistics
We discuss models for multivariate counts observed at fixed spatial locations of a region of interest. Our approach is based on a continuous mixture of independent Poisson distributions. The mixing component is able to capture correlation among components of the observed vector and across space through the use of a linear model of coregionalization. We introduce here the use of covariates to allow for possible non‐stationarity of the covariance structure of the mixing component. We analyse joint spatial variation of counts of four fish species abundant in Lake Saint Pierre, Quebec, Canada. Models allowing the covariance structure of the spatial random effects to depend on a covariate, geodetic lake depth, showed improved fit relative to stationary models.
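The overdispersion that Poisson mixing induces can be seen in a toy nonspatial version (a sketch; the log-normal mixing distribution, its parameters, and the sample size are illustrative assumptions, and the spatial coregionalization structure is omitted): mixing inflates the variance above the Poisson mean.

```python
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's inversion-by-multiplication sampler; adequate for small lambda."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Continuous Poisson mixture: lambda_i ~ LogNormal(1, 0.5), y_i ~ Poisson(lambda_i)
ys = [poisson(random.lognormvariate(1.0, 0.5)) for _ in range(20000)]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
print(round(mean, 2), round(var, 2))  # variance exceeds the mean: overdispersion
```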
Steve Selvin
- Published in print:
- 2004
- Published Online:
- September 2009
- ISBN:
- 9780195172805
- eISBN:
- 9780199865697
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780195172805.003.03
- Subject:
- Public Health and Epidemiology, Public Health, Epidemiology
This chapter discusses the power of a statistical test and the closely related issue of sample size. Topics covered include normal distribution, Poisson distribution, sample size, and loss of statistical power and bias from grouping continuous data.
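The power-versus-sample-size relationship for a normal-theory test can be sketched in a few lines (the effect size, alpha level, and sample sizes below are invented illustrations, not the chapter's own examples):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

alpha = 0.05
z_alpha = 1.645  # one-sided 5% critical value
delta = 0.5      # true shift in standard-deviation units (illustrative)

for n in (10, 30, 50, 100):
    power = 1.0 - norm_cdf(z_alpha - delta * math.sqrt(n))
    print(n, round(power, 3))  # power rises toward 1 as n grows
```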
Sergey N. Dorogovtsev
- Published in print:
- 2010
- Published Online:
- May 2010
- ISBN:
- 9780199548927
- eISBN:
- 9780191720574
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199548927.003.0002
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter gives insight into the simplest and most studied random networks: the classical random graphs. The Erdős–Rényi and Gilbert models are described, and some of their properties and characteristics are considered. These characteristics include a Poisson degree distribution, the number of loops and clustering, and average shortest-path length. The statistics of connected components in these random networks, and the birth of a giant connected component, are considered.
S. N. Dorogovtsev and J. F. F. Mendes
- Published in print:
- 2003
- Published Online:
- January 2010
- ISBN:
- 9780198515906
- eISBN:
- 9780191705670
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198515906.003.0002
- Subject:
- Physics, Soft Matter / Biological Physics
This chapter introduces the basic characteristics and notions of graph theory and the science of networks: degree, degree distribution, clustering coefficient, the average length of the shortest path between two nodes in a network, the size of a giant connected component, and others. The classical random graphs and their characteristics are introduced and explained. The contrasting types of degree distributions are discussed: namely, Poisson and scale-free distributions.
Peter Coles
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780198567622
- eISBN:
- 9780191718250
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567622.003.0005
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
This chapter examines randomness and how it applies (or does not) in both abstract mathematics and in physical systems. There are many different ways in which a sequence of events could be said to be ‘random’. The mathematical theory of random processes, sometimes called stochastic processes, depends on being able to construct joint probabilities of large sequences of random variables, which can be very tricky, to say the least. There are, however, some kinds of random processes where the theory is relatively straightforward. One class is when the sequence has no memory at all; this type of sequence is sometimes called white noise. Random processes can be either stationary or ergodic. The chapter also discusses predictability in principle and practice, and explains why pulling numbers out of an address book leads to a distribution of first digits that is not at all uniform. Aside from sequences of variables, other manifestations of randomness include points, patterns, and the Poisson distribution.
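The nonuniform first-digit distribution alluded to here is Benford's law, under which digit d occurs with probability log10(1 + 1/d); as an illustration (not from the chapter), the leading digits of powers of two are a sequence known to follow this law:

```python
import math
from collections import Counter

N = 5000
# Leading decimal digit of 2**n for n = 1..N
digits = Counter(int(str(2 ** n)[0]) for n in range(1, N + 1))

for d in range(1, 10):
    observed = digits[d] / N
    benford = math.log10(1 + 1 / d)  # Benford probability for digit d
    print(d, round(observed, 4), round(benford, 4))
```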
Steven J. Osterlind
- Published in print:
- 2019
- Published Online:
- January 2019
- ISBN:
- 9780198831600
- eISBN:
- 9780191869532
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198831600.003.0011
- Subject:
- Mathematics, Logic / Computer Science / Mathematical Philosophy
This chapter shows that the quantification worldview was not spreading uniformly across even the developed world. In the United States, the spread was slow, because nearly everyone was consumed with making a new country. It was also slow in Russia, because of the dominance of the Romanoff family and their wars, and it did not spread to the Middle East, because it was preoccupied with warring and had almost no widespread education system. In England, an engineer named Charles Babbage contributed to bringing a mindset of quantification to ordinary people, through his Difference Engines Nos. 1 and 2, possibly the first programmable computers with CPUs. In probability theory, statistical “rare events” are described, and how Siméon Poisson made them into a specialized distribution is explained simply, as is his Poisson distribution, illustrated by the famous example of Prussian horse kicks. Further advancements included the invention of Fourier transforms.
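The Prussian horse-kick example can be re-fit in a few lines (the counts below are Bortkiewicz's figures as usually quoted: 200 corps-years of cavalry records): the sample mean gives a rate of about 0.61 deaths per corps-year, and the Poisson formula reproduces the observed frequencies closely.

```python
import math

# Deaths per corps-year -> number of corps-years observed (as usually quoted)
observed = {0: 109, 1: 65, 2: 22, 3: 3, 4: 1}
total = sum(observed.values())                           # 200 corps-years
lam = sum(k * n for k, n in observed.items()) / total    # sample mean = 0.61

for k, n in sorted(observed.items()):
    expected = total * math.exp(-lam) * lam ** k / math.factorial(k)
    print(k, n, round(expected, 1))  # e.g. 0 deaths: 109 observed vs ~108.7 expected
```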
Andy Hector
- Published in print:
- 2021
- Published Online:
- August 2021
- ISBN:
- 9780198798170
- eISBN:
- 9780191839399
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198798170.003.0016
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies, Ecology
GLMs using the Poisson distribution are a good starting place when dealing with integer count data. The default log link function prevents the prediction of negative counts and the Poisson distribution models the variance (approximately equal to the mean).
James B. Elsner and Thomas H. Jagger
- Published in print:
- 2013
- Published Online:
- November 2020
- ISBN:
- 9780199827633
- eISBN:
- 9780197563199
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780199827633.003.0011
- Subject:
- Earth Sciences and Geography, Meteorology and Climatology
Here in Part II, we focus on statistical models for understanding and predicting hurricane climate. This chapter shows you how to model hurricane occurrence. This is done using the annual count of hurricanes making landfall in the United States. We also consider the occurrence of hurricanes across the basin and by origin. We begin with exploratory analysis and then show you how to model counts with Poisson regression. Issues of model fit, interpretation, and prediction are considered in turn. The topic of how to assess forecast skill is examined, including how to perform cross-validation. Alternatives to the Poisson regression model are considered. Logistic regression and receiver operating characteristic (ROC) curves are also covered. You use the data set US.txt, which contains a list of tropical cyclone counts by year (see Chapter 2). The counts indicate the number of hurricanes hitting in the United States (excluding Hawaii). Input the data, save them as a data frame object, and print out the first six lines by typing

> H = read.table("US.txt", header=TRUE)
> head(H)

The columns include year Year, number of U.S. hurricanes All, number of major U.S. hurricanes MUS, number of U.S. Gulf coast hurricanes G, number of Florida hurricanes FL, and number of East coast hurricanes E. Save the total number of years in the record as n and the average number of hurricanes per year as rate.

> n = length(H$Year); rate = mean(H$All)
> n; rate
[1] 160
[1] 1.69

The average number of U.S. hurricanes is 1.69 per year over these 160 years. First plot a time series and a distribution of the annual counts. Together, the two plots provide a nice summary of the information in your data relevant to any modeling effort.

> par(las=1)
> layout(matrix(c(1, 2), 1, 2, byrow=TRUE),
+   widths=c(3/5, 2/5))
> plot(H$Year, H$All, type="h", xlab="Year",
+   ylab="Hurricane Count")
> grid()
> mtext("a", side=3, line=1, adj=0, cex=1.1)
> barplot(table(H$All), xlab="Hurricane Count",
+   ylab="Number of Years", main="")
> mtext("b", side=3, line=1, adj=0, cex=1.1)

The layout function divides the plot page into rows and columns as specified in the matrix function (first argument).
Melvin Lax, Wei Cai, and Min Xu
- Published in print:
- 2006
- Published Online:
- January 2010
- ISBN:
- 9780198567769
- eISBN:
- 9780191718359
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567769.003.0013
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
The random motion of particles in a turbid medium, due to multiple elastic scattering, obeys the classic Boltzmann transport equation. This chapter shows how the center position and the diffusion coefficients of an incident collimated beam into an infinite uniform turbid medium are derived using an elementary analysis of the random walk of photons in a turbid medium. Light propagation in a multiple scattering (turbid) medium such as the atmosphere, colloidal suspensions, and biological tissue is commonly treated by the theory of radiative transfer. The basic equation of radiative transfer is the elastic Boltzmann equation, a nonseparable integro-differential equation of first order for which an exact closed form solution is not known except for the case for isotropic scatterers. Solutions are often based on truncation of the spherical harmonics expansion of the photon distribution function or resort to numerical calculation including Monte Carlo simulations. Macroscopic and microscopic statistics in the direction space are also discussed, along with the generalised Poisson distribution.
Andy Hector
- Published in print:
- 2015
- Published Online:
- March 2015
- ISBN:
- 9780198729051
- eISBN:
- 9780191795855
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198729051.003.0009
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies, Ecology
This chapter looks at three of the main types of generalized linear model (GLM). GLMs using the Poisson distribution are a good starting place when dealing with integer count data. The default log link function prevents the prediction of negative counts and the Poisson distribution models the variance (approximately equal to the mean). GLMs with a binomial distribution are designed for the analysis of binomial counts (how many times something occurred relative to the total number of possible times it could have occurred). A logistic link function constrains predictions to be above zero and below the maximum using the S-shaped logistic curve. Overdispersion can be diagnosed and dealt with using a quasi-maximum likelihood extension to GLM analysis. Binomial GLMs can also be used to analyse binary data as a special case with some minor differences to the analysis introduced by the constrained nature of the binary data.
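The core of Poisson-GLM fitting can be sketched without any statistics library (the toy counts below are invented for illustration): Fisher scoring with the canonical log link reduces, for one intercept and one slope, to a repeated 2×2 weighted least-squares update with weights equal to the fitted means.

```python
import math

# Invented count data: counts growing with the covariate x
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [1, 2, 2, 4, 6, 9, 13, 20]

# Fit log(mu) = b0 + b1 * x by Fisher scoring (weights W = mu for the log link)
b0, b1 = math.log(sum(y) / len(y)), 0.0  # stable starting values
for _ in range(25):
    mu = [math.exp(b0 + b1 * xi) for xi in x]
    s0 = sum(mu)                                   # entries of X' W X
    s1 = sum(m * xi for m, xi in zip(mu, x))
    s2 = sum(m * xi * xi for m, xi in zip(mu, x))
    g0 = sum(yi - m for yi, m in zip(y, mu))       # score vector X' (y - mu)
    g1 = sum((yi - m) * xi for yi, m, xi in zip(y, mu, x))
    det = s0 * s2 - s1 * s1
    b0 += (s2 * g0 - s1 * g1) / det                # solve the 2x2 system
    b1 += (s0 * g1 - s1 * g0) / det

print(round(b0, 3), round(b1, 3))  # slope is positive on the log scale
```

At convergence the score equations force the fitted means to sum to the observed counts, and every prediction exp(b0 + b1·x) is positive, which is the point of the log link.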
J. Durbin and S.J. Koopman
- Published in print:
- 2012
- Published Online:
- December 2013
- ISBN:
- 9780199641178
- eISBN:
- 9780191774881
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199641178.003.0014
- Subject:
- Mathematics, Probability / Statistics
This chapter discusses examples which illustrate the methods that were developed in Part II for analysing observations using non-Gaussian and nonlinear state space models. These include the monthly number of van drivers killed in road accidents in Great Britain modelled by a Poisson distribution; the usefulness of the t-distribution for modelling observation errors in a gas consumption series containing outliers; the volatility of exchange rate returns; and fitting a binary model to the results of the annual boat race between teams of the universities of Oxford and Cambridge.
Carey Witkov and Keith Zengel
- Published in print:
- 2019
- Published Online:
- November 2019
- ISBN:
- 9780198847144
- eISBN:
- 9780191882074
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198847144.003.0007
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics, Particle Physics / Astrophysics / Cosmology
A variety of advanced topics are introduced to offer greater challenge for beginners and to answer thorny questions often asked by early researchers who are just starting to use chi-squared analysis. Topics covered include probability density functions, p-values, the derivation of the chi-squared probability density function and its uses, reduced chi-squared, the Poisson distribution, and advanced techniques for maximum likelihood estimation in cases where uncertainties are not Gaussian or the model is nonlinear. Problems are included (with solutions in an appendix).
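The reduced chi-squared and p-value computations can be sketched directly (the measurements, uncertainties, and the fixed model y = 2x below are invented): with no fitted parameters the degrees of freedom equal the number of points, and for an even number of degrees of freedom the chi-squared tail probability has a closed form.

```python
import math

# Invented measurements with uncertainties, tested against the fixed model y = 2x
xs = [1, 2, 3, 4, 5, 6]
ys = [2.1, 3.9, 6.3, 7.8, 10.4, 11.7]
sigmas = [0.3] * 6

chi2 = sum(((y - 2 * x) / s) ** 2 for x, y, s in zip(xs, ys, sigmas))
dof = len(xs)          # no fitted parameters, so dof = number of points
reduced = chi2 / dof   # near 1 for a good fit with honest uncertainties

# For even dof, P(chi2 > x) = exp(-x/2) * sum_{i < dof/2} (x/2)^i / i!
half = chi2 / 2
p_value = math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(dof // 2))
print(round(reduced, 2), round(p_value, 3))
```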
Warren Nagourney
- Published in print:
- 2014
- Published Online:
- June 2014
- ISBN:
- 9780199665488
- eISBN:
- 9780191779442
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199665488.003.0006
- Subject:
- Physics, Atomic, Laser, and Optical Physics
This chapter combines the results of Chapters 3 and 5 by placing an amplifying medium in a resonant cavity and evaluating the behavior of the resulting laser oscillator. First, the conditions for laser oscillation and the expected power output of the laser are determined. Next, optical pumping techniques for three-level and four-level lasers are discussed. Finally, the spectral characteristics of laser radiation are considered. This includes the frequency pulling of the empty cavity modes, the multimode behavior of an inhomogeneously broadened laser (including spatial hole burning) and the ultimate linewidth (Schawlow–Townes limit) of a laser. The remarkable behavior of the photon occupation number above threshold is also discussed, and the fairly subtle distinction between the photon statistics of a coherent laser beam and that of thermal radiation is treated in some detail.
Stephen Barnett
- Published in print:
- 2009
- Published Online:
- November 2020
- ISBN:
- 9780198527626
- eISBN:
- 9780191916625
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/oso/9780198527626.003.0006
- Subject:
- Computer Science, Mathematical Theory of Computation
The practical implementation of quantum information technologies requires, for the most part, highly advanced and currently experimental procedures. One exception is quantum cryptography, or quantum key distribution, which has been successfully demonstrated in many laboratories and has reached an advanced level of development. It will probably become the first commercial application of quantum information. In quantum key distribution, Alice and Bob exploit a quantum channel to create a secret shared key comprising a random string of binary digits. This key can then be used to protect a subsequent communication between them. The principal idea is that the secrecy of the key distribution is ensured by the laws of quantum physics. Proving security for practical communication systems is a challenging problem and requires techniques that are beyond the scope of this book. At a fundamental level, however, the ideas are simple and may readily be understood with the knowledge we have already acquired. Quantum cryptography is the latest idea in the long history of secure (and not so secure) communications and, if it is to develop, it will have to compete with existing technologies. For this reason we begin with a brief survey of the history and current state of the art in secure communications before turning to the possibilities offered by quantum communications. The history of cryptography is a long and fascinating one. As a consequence of the success or, more spectacularly, the failure of ciphers, wars have been fought, battles decided, kingdoms won, and heads lost. In the information age, ciphers and cryptosystems have become part of everyday life; we use them to protect our computers, to shop over the Internet, and to access our money via an ATM (automated teller machine). 
One of the oldest and simplest of all ciphers is the shift or Caesar cipher (attributed to Julius Caesar), in which each letter is replaced by the letter a known (and secret) number of places further along the alphabet. If the shift is 1, for example, then A is enciphered as B, B→C, · · ·, Y→Z, Z→A. A shift of five places leads us to make the replacements A→F, B→G, · · ·, Y→D, Z→E.
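The shift rule described above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the book; the function name `caesar_encipher` is my own choice. The key step is modular arithmetic on letter positions, so that Z wraps back around to A.

```python
def caesar_encipher(text: str, shift: int) -> str:
    """Shift each letter of `text` by `shift` places in the alphabet,
    wrapping around from Z back to A (the Caesar cipher)."""
    result = []
    for ch in text.upper():
        if ch.isalpha():
            # Map A..Z to 0..25, shift modulo 26, map back to a letter.
            result.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

# A shift of 1 maps A→B, ..., Y→Z, Z→A; a shift of 5 maps A→F, Z→E.
print(caesar_encipher("AZ", 1))   # → BA
print(caesar_encipher("AZ", 5))   # → FE
```

Deciphering is the same operation with the negated shift, which is why the shared shift value plays the role of the secret key.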