D. A. Bini, G. Latouche, and B. Meini
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198527688
- eISBN:
- 9780191713286
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198527688.003.0005
- Subject:
- Mathematics, Numerical Analysis
This chapter considers a series of processes with a variety of transition structures and presents their analysis in a unifying manner. These processes are grouped under the generic name of phase-type queues; they include G/M/1-type Markov chains, QBD processes, Markov chains with Toeplitz-like transitions and limited displacements (non-skip-free), and tree-like processes. A duality property between M/G/1 and G/M/1 Markov chains is described, and a reduction of M/G/1 and G/M/1 Markov chains to QBD is analysed.
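The block-tridiagonal transition structure shared by the QBD processes listed above can be made concrete with a small sketch. This is illustrative only and not taken from the chapter: the block names A_minus1, A0, A1, the boundary block B0, and their numeric values are assumptions chosen so that the rows of A_{-1} + A_0 + A_1 sum to one.

```python
# Minimal sketch (not from the chapter): assembling a truncated transition
# matrix for a discrete-time QBD from illustrative blocks A_minus1 (down one
# level), A0 (same level), A1 (up one level); values are made up but chosen
# so interior block rows are stochastic.
import numpy as np

A_minus1 = np.array([[0.3, 0.1], [0.2, 0.2]])   # one level down
A0       = np.array([[0.2, 0.1], [0.1, 0.2]])   # same level
A1       = np.array([[0.2, 0.1], [0.1, 0.2]])   # one level up
B0       = A0 + A_minus1                        # illustrative boundary block at level 0

def qbd_matrix(levels):
    """Truncate the infinite block-tridiagonal QBD to `levels` levels."""
    m = A0.shape[0]
    P = np.zeros((levels * m, levels * m))
    for k in range(levels):
        P[k*m:(k+1)*m, k*m:(k+1)*m] = B0 if k == 0 else A0
        if k + 1 < levels:
            P[k*m:(k+1)*m, (k+1)*m:(k+2)*m] = A1
        if k > 0:
            P[k*m:(k+1)*m, (k-1)*m:k*m] = A_minus1
    return P

P = qbd_matrix(4)
print(P.sum(axis=1))   # all rows sum to 1 except the truncated last level
```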
Lyn C. Thomas
- Published in print:
- 2009
- Published Online:
- May 2009
- ISBN:
- 9780199232130
- eISBN:
- 9780191715914
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199232130.003.0004
- Subject:
- Mathematics, Applied Mathematics, Mathematical Finance
This chapter begins by reviewing the role of behavioural scoring and risk/reward matrices in the way a lender manages borrowers. It points out that current methods do not allow for future changes in customer behaviour, nor do they optimize expected profit. It then shows how two approaches — Markov chain models and survival analysis ideas — can lead to dynamic, profitability-based models. In particular, the extension of Markov chains to Markov decision processes allows one to optimize decisions such as how to adjust the credit limit. In survival analysis, proportional hazards models lead to hazard scores, which play the same role as credit scores but apply to all future performance periods. Moreover, the ideas of competing risks in survival analysis allow one to build profitability models that allow for default, prepayment, and attrition.
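To make the Markov-decision-process idea concrete, here is a hedged toy sketch of optimizing a credit-limit action by value iteration. It is not the book's model: the states, actions, transition probabilities, rewards, and discount factor are all invented for illustration.

```python
# Illustrative only: a toy Markov decision process in the spirit of the
# chapter's credit-limit example.  States, actions, transition probabilities,
# and rewards are invented, not taken from the book.
import numpy as np

states  = ["good", "risky", "default"]        # behavioural states
actions = ["keep_limit", "raise_limit"]

# P[a][s, s'] = transition probability; R[a][s] = expected one-period profit
P = {
    "keep_limit":  np.array([[0.90, 0.08, 0.02],
                             [0.30, 0.60, 0.10],
                             [0.00, 0.00, 1.00]]),
    "raise_limit": np.array([[0.85, 0.10, 0.05],
                             [0.20, 0.60, 0.20],
                             [0.00, 0.00, 1.00]]),
}
R = {"keep_limit":  np.array([1.0, 0.5, 0.0]),
     "raise_limit": np.array([1.5, 0.8, 0.0])}

gamma = 0.95                                   # discount factor
V = np.zeros(len(states))
for _ in range(500):                           # value iteration
    Q = {a: R[a] + gamma * P[a] @ V for a in actions}
    V = np.maximum(Q["keep_limit"], Q["raise_limit"])

policy = [max(actions, key=lambda a: (R[a] + gamma * P[a] @ V)[s])
          for s in range(len(states))]
print(dict(zip(states, policy)), V.round(2))   # best action and value per state
```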
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0004
- Subject:
- Mathematics, Probability / Statistics
This chapter deals with Markov processes. It first defines the “Markov property” and shows that all the relevant information about a Markov process assuming values in a finite set of cardinality n can be captured by a nonnegative n × n matrix known as the state transition matrix, together with an n-dimensional probability distribution of the initial state. It then invokes the results of the previous chapter on nonnegative matrices to analyze the temporal evolution of Markov processes. It also discusses estimation of the state transition matrix and considers the dynamics of stationary Markov chains, recurrent and transient states, hitting probabilities and mean hitting times, and the ergodicity of Markov chains.
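A minimal sketch of the objects the abstract names: an n × n state transition matrix together with an initial distribution, the temporal evolution pi_{t+1} = pi_t P, and the stationary distribution. The 3-state matrix below is an arbitrary illustration, not an example from the book.

```python
# A minimal sketch (not from the book): an n x n state transition matrix P and
# an initial distribution pi0 capture a finite Markov process; iterating pi P
# gives the distribution at each time, and the stationary distribution is the
# normalized left eigenvector of P for eigenvalue 1.
import numpy as np

P = np.array([[0.7, 0.2, 0.1],     # illustrative 3-state transition matrix
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
pi0 = np.array([1.0, 0.0, 0.0])    # start in state 0 with probability 1

pi_t = pi0.copy()
for t in range(50):                # temporal evolution: pi_{t+1} = pi_t P
    pi_t = pi_t @ P

w, v = np.linalg.eig(P.T)          # left eigenvectors of P
stat = np.real(v[:, np.argmin(np.abs(w - 1))])
stat = stat / stat.sum()           # stationary distribution: pi = pi P
print(pi_t.round(4), stat.round(4))
```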
Alan Frieze and Eric Vigoda
- Published in print:
- 2007
- Published Online:
- September 2007
- ISBN:
- 9780198571278
- eISBN:
- 9780191718885
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198571278.003.0004
- Subject:
- Mathematics, Probability / Statistics
In recent years, considerable progress has been made on the analysis of Markov chains for generating a random colouring of an input graph. These improvements have come in conjunction with refinements of the coupling method, which is a classical tool in probability theory. This chapter surveys results on generating random colourings and related technical improvements. It focuses on Markov Chain Monte Carlo (MCMC) algorithms for approximately counting the number of k-colourings of a graph.
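One standard chain of the kind surveyed is the Glauber dynamics for proper k-colourings: repeatedly pick a vertex and recolour it uniformly among the colours not used by its neighbours. The sketch below is a generic illustration under that assumption; the graph, k, and step count are placeholders, not the chapter's examples.

```python
# Hedged sketch of Glauber dynamics for sampling proper k-colourings; the
# graph and parameters are illustrative, not from the chapter.
import random

def glauber_colouring(adj, k, steps, seed=0):
    """adj: dict vertex -> list of neighbours; returns a (near-)random proper colouring."""
    rng = random.Random(seed)
    colour = {}
    for v in adj:                                   # greedy initial proper colouring
        used = {colour[u] for u in adj[v] if u in colour}
        colour[v] = next(c for c in range(k) if c not in used)
    vertices = list(adj)
    for _ in range(steps):
        v = rng.choice(vertices)                    # pick a vertex at random
        banned = {colour[u] for u in adj[v]}
        allowed = [c for c in range(k) if c not in banned]
        colour[v] = rng.choice(allowed)             # recolour uniformly among allowed colours
    return colour

# a 4-cycle with k = 4 colours (k exceeds the maximum degree, so moves always exist)
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(glauber_colouring(adj, k=4, steps=10_000))
```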
Jeffrey R. Groff, Hilary DeRemigio, and Gregory D. Smith
- Published in print:
- 2009
- Published Online:
- February 2010
- ISBN:
- 9780199235070
- eISBN:
- 9780191715778
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199235070.003.0002
- Subject:
- Mathematics, Biostatistics
This chapter is an introduction to modelling stochastically gating ion channels using continuous-time discrete-state Markov chains. Analytical and numerical methods are presented for determining steady-state statistics of single channel gating, including the stationary distribution and open and closed dwell times. Model reduction techniques such as fast–slow analysis and state lumping are discussed, as well as Gillespie’s method for simulating stochastically gating ion channels. Techniques for the estimation of model parameters and identification of model topology are briefly discussed, as well as the thermodynamic requirements that constrain the selection of rate constants. Approaches for modelling clusters of interacting ion channels using Markov chains are also summarized. Our presentation is restricted to Markov chain models of intracellular calcium release sites, where clusters of calcium release channels are coupled via changes in the local calcium concentration and exhibit stochastic calcium excitability reminiscent of calcium puffs and sparks. Representative release site simulations are presented showing how phenomena such as allosteric coupling and calcium-dependent inactivation, in addition to calcium-dependent activation, affect the generation and termination of calcium puffs and sparks. The chapter concludes by considering the state space explosion that occurs as more channels are included in Markov chain models of calcium release sites. Techniques used to mitigate this state space explosion are discussed, including the use of Kronecker representations and mean-field approximations.
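As a concrete illustration of Gillespie's method mentioned above, here is a minimal sketch for a single two-state channel C ⇌ O. The rate constants and simulation length are arbitrary placeholders, not parameters from the chapter.

```python
# Illustrative sketch, not from the chapter: Gillespie's method for a single
# two-state channel C <-> O with opening rate k_plus and closing rate k_minus
# (the rate constants here are arbitrary placeholders).
import math
import random

def gillespie_two_state(k_plus, k_minus, t_end, seed=1):
    rng = random.Random(seed)
    t, state = 0.0, "C"                              # start closed
    trajectory = [(t, state)]
    while t < t_end:
        rate = k_plus if state == "C" else k_minus   # only one possible transition
        t += -math.log(1.0 - rng.random()) / rate    # exponential waiting time
        state = "O" if state == "C" else "C"
        trajectory.append((t, state))
    return trajectory

traj = gillespie_two_state(k_plus=2.0, k_minus=5.0, t_end=3.0)
open_time = sum(t2 - t1 for (t1, s), (t2, _) in zip(traj, traj[1:]) if s == "O")
print(f"fraction open ~ {open_time / traj[-1][0]:.2f}")   # ~ k_plus / (k_plus + k_minus)
```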
D. A. Bini, G. Latouche, and B. Meini
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198527688
- eISBN:
- 9780191713286
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198527688.003.0007
- Subject:
- Mathematics, Numerical Analysis
This chapter is concerned with quadratically convergent algorithms for solving nonlinear matrix equations encountered in M/G/1, G/M/1, and QBD processes. The algorithm of logarithmic reduction for solving QBDs is described and analysed, and an algebraic proof of its convergence is provided. Then the cyclic reduction technique for QBDs is explained and related to logarithmic reduction. Cyclic reduction is extended to M/G/1 and G/M/1 processes, and convergence and applicability results are proved. Computational issues concerning the implementation of cyclic reduction are treated by relying on the tools introduced in Chapters 2 and 3.
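A sketch of the logarithmic reduction recursion, written here in its standard Latouche–Ramaswami form under the assumption that G is the minimal nonnegative solution of G = A_{-1} + A_0 G + A_1 G²; the 2 × 2 blocks are invented for illustration, and the residual check at the end verifies the computed G.

```python
# Hedged sketch of logarithmic reduction (Latouche-Ramaswami form) for
# G = A_{-1} + A_0 G + A_1 G^2; the blocks are small illustrative matrices,
# not an example from the book.
import numpy as np

A_minus1 = np.array([[0.4, 0.1], [0.2, 0.3]])   # down one level
A0       = np.array([[0.1, 0.1], [0.1, 0.1]])   # same level
A1       = np.array([[0.2, 0.1], [0.2, 0.1]])   # up one level (rows of the sum are stochastic)

I = np.eye(2)
L = np.linalg.solve(I - A0, A_minus1)
H = np.linalg.solve(I - A0, A1)
G, T = L.copy(), H.copy()
for _ in range(30):                              # quadratic convergence: few steps needed
    U = H @ L + L @ H
    H = np.linalg.solve(I - U, H @ H)
    L = np.linalg.solve(I - U, L @ L)
    G = G + T @ L
    T = T @ H

residual = A_minus1 + A0 @ G + A1 @ G @ G - G
print(G.round(6), np.abs(residual).max())        # residual should be near machine precision
```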
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0006
- Subject:
- Mathematics, Probability / Statistics
This chapter considers the basic properties of hidden Markov processes (HMPs) or hidden Markov models (HMMs), a special type of stochastic process. It begins with a discussion of three distinct types of HMMs and shows that they are all equivalent from the standpoint of their expressive power or modeling ability: the Type 1 hidden Markov model, an HMM of the deterministic function of a Markov chain type; the Type 2 hidden Markov model, an HMM of the random function of a Markov chain type; and the Type 3 hidden Markov model, an HMM of the joint Markov process type. The chapter also examines various issues related to the computation of likelihoods in an HMM before concluding with an overview of the Viterbi algorithm and the Baum–Welch algorithm.
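For the Viterbi algorithm mentioned at the end of the abstract, a minimal log-space sketch follows; the transition matrix, emission matrix, initial distribution, and observation sequence are illustrative assumptions, not the book's examples.

```python
# Minimal Viterbi sketch (log-space) for an HMM with transition matrix A,
# emission matrix B, and initial distribution pi; the numbers are illustrative.
import numpy as np

A  = np.array([[0.8, 0.2], [0.3, 0.7]])          # state transition probabilities
B  = np.array([[0.9, 0.1], [0.2, 0.8]])          # P(observation | state)
pi = np.array([0.5, 0.5])
obs = [0, 0, 1, 1, 0]                            # observed symbol indices

logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
n_states, T = A.shape[0], len(obs)
delta = np.zeros((T, n_states))                  # best log-probability so far
psi   = np.zeros((T, n_states), dtype=int)       # back-pointers

delta[0] = logpi + logB[:, obs[0]]
for t in range(1, T):
    scores = delta[t-1][:, None] + logA          # scores[i, j]: end in j coming from i
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + logB[:, obs[t]]

path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):                    # trace back the best path
    path.append(int(psi[t][path[-1]]))
print(path[::-1])                                # most likely hidden state sequence
```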
D. A. Bini, G. Latouche, and B. Meini
- Published in print:
- 2005
- Published Online:
- September 2007
- ISBN:
- 9780198527688
- eISBN:
- 9780191713286
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198527688.003.0009
- Subject:
- Mathematics, Numerical Analysis
Some specialized structures are investigated in this chapter and some of the algorithms in previous chapters are adapted to the specific cases. Markov chains with limited displacement (non-skip-free processes) are solved by means of functional iterations and cyclic reduction. Markov chains of M/G/1-type are reduced to a special QBD process with infinite blocks and treated with cyclic reduction. Finally, three different algorithms for tree-like stochastic processes, relying on fixed point iterations, Newton’s iteration, and cyclic reduction, are introduced and analysed.
N. Thompson Hobbs and Mevin B. Hooten
- Published in print:
- 2015
- Published Online:
- October 2017
- ISBN:
- 9780691159287
- eISBN:
- 9781400866557
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691159287.003.0007
- Subject:
- Biology, Ecology
This chapter explains how to implement Bayesian analyses using the Markov chain Monte Carlo (MCMC) algorithm, a set of methods for Bayesian analysis made popular by the seminal paper of Gelfand and Smith (1990). It begins with an explanation of MCMC with a heuristic, high-level treatment of the algorithm, describing its operation in simple terms with a minimum of formalism. In this first part, the chapter explains the algorithm so that all readers can gain an intuitive understanding of how to find the posterior distribution by sampling from it. Next, the chapter offers a somewhat more formal treatment of how MCMC is implemented mathematically. Finally, this chapter discusses implementation of Bayesian models via two routes—by using software and by writing one's own algorithm.
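In the spirit of the chapter's heuristic treatment (this is not the authors' code), a minimal random-walk Metropolis sketch shows how a posterior can be found by sampling from it; the data, prior, and proposal scale are invented for illustration.

```python
# Heuristic sketch: random-walk Metropolis for the posterior of a normal mean
# with known variance and a normal prior; data and tuning constants are invented.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.0, size=50)      # fake data, sigma known = 1

def log_posterior(mu):
    log_prior = -0.5 * (mu / 10.0) ** 2           # N(0, 10^2) prior, up to a constant
    log_lik = -0.5 * np.sum((y - mu) ** 2)        # N(mu, 1) likelihood
    return log_prior + log_lik

mu, samples = 0.0, []
for _ in range(20_000):
    prop = mu + rng.normal(scale=0.3)             # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(mu):
        mu = prop                                 # accept; otherwise keep current value
    samples.append(mu)

print(np.mean(samples[5_000:]), np.std(samples[5_000:]))  # posterior mean and sd after burn-in
```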
M. Vidyasagar
- Published in print:
- 2014
- Published Online:
- October 2017
- ISBN:
- 9780691133157
- eISBN:
- 9781400850518
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691133157.003.0005
- Subject:
- Mathematics, Probability / Statistics
This chapter provides an introduction to large deviation theory. It begins with an overview of the motivation for the problem under study, focusing on probability distributions and how to construct an empirical distribution. It then considers the notion of a lower semi-continuous function and that of a lower semi-continuous relaxation before discussing the large deviation property for i.i.d. samples. In particular, it describes Sanov's theorem for a finite alphabet and proceeds by analyzing the large deviation property for Markov chains, taking into account stationary distributions, entropy and relative entropy rates, the rate function for doubleton frequencies, and the rate function for singleton frequencies.
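A small sketch of the two objects at the heart of Sanov's theorem for a finite alphabet: the empirical distribution of i.i.d. samples and the relative entropy D(q || p) that serves as the rate function. The alphabet size and probabilities are arbitrary choices, not the book's examples.

```python
# Illustrative sketch: empirical distribution of i.i.d. samples and the
# relative entropy D(q || p), the Sanov rate function; probabilities are made up.
import numpy as np

p = np.array([0.5, 0.3, 0.2])                    # true distribution on {0, 1, 2}
rng = np.random.default_rng(0)
x = rng.choice(len(p), size=1000, p=p)           # i.i.d. samples
q = np.bincount(x, minlength=len(p)) / len(x)    # empirical distribution

def relative_entropy(q, p):
    """D(q || p) = sum_i q_i log(q_i / p_i), with 0 log 0 = 0."""
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

# Sanov: P(empirical distribution ~ q) decays roughly like exp(-n * D(q || p))
print(q, relative_entropy(q, p))
```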
Brian Skyrms
- Published in print:
- 2012
- Published Online:
- January 2013
- ISBN:
- 9780199652808
- eISBN:
- 9780191745829
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199652808.003.0014
- Subject:
- Philosophy, Philosophy of Science, Metaphysics/Epistemology
In his Nachlass, Carnap listed outstanding problems for the development of inductive logic. One was the treatment of what he called “analogy by proximity”. This can be handled naturally along Carnapian lines.
Jesper Møller
- Published in print:
- 2009
- Published Online:
- February 2010
- ISBN:
- 9780199232574
- eISBN:
- 9780191716393
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199232574.003.0009
- Subject:
- Mathematics, Geometry / Topology
This contribution concerns statistical inference for parametric models used in stochastic geometry, based both on quick and simple simulation-free procedures and on more comprehensive methods that combine a maximum likelihood or Bayesian approach with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations, the focus is on spatial point processes.
Jonathan Bendor, Daniel Diermeier, David A. Siegel, and Michael M. Ting
- Published in print:
- 2011
- Published Online:
- October 2017
- ISBN:
- 9780691135076
- eISBN:
- 9781400836802
- Item type:
- chapter
- Publisher:
- Princeton University Press
- DOI:
- 10.23943/princeton/9780691135076.003.0002
- Subject:
- Political Science, American Politics
This chapter discusses some general properties of aspiration-based adaptive rules (ABARs). It begins with an overview of propensity and aspiration-based adjustment, using axioms to represent three premises: agents have aspirations, they compare payoffs to aspirations, and these comparisons determine the key qualitative properties of how agents adjust their action propensities. It then considers stochastic processes such as Markov chains before turning to some useful types of ABARs, along with realistic aspirations and how the behavior and performance of ABARs are associated with the existence of relations of Pareto dominance among alternatives. It also examines the empirical content of ABAR-driven models and concludes with an analysis of some evidence regarding aspiration-based adaptation, paying attention to behavior and hedonics.
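Purely as an illustration of the three premises described above (agents hold aspirations, compare payoffs to them, and adjust action propensities accordingly), here is a toy satisficing rule. It is not the authors' ABAR formalism; the update rule, payoffs, and parameters are all invented.

```python
# A toy sketch only (not the book's model): one simple aspiration-based rule.
# The agent chooses by propensity, compares the realized payoff with its
# aspiration, reinforces the chosen action if satisfied and inhibits it if not.
import random

def abar_step(propensities, payoffs, aspiration, learning_rate=0.2, rng=random):
    actions = list(propensities)
    weights = [propensities[a] for a in actions]
    a = rng.choices(actions, weights=weights)[0]       # choose by propensity
    pay = payoffs[a]()                                 # realized (noisy) payoff
    satisfied = pay >= aspiration
    for b in actions:                                  # positive/negative feedback
        target = 1.0 if (b == a and satisfied) or (b != a and not satisfied) else 0.0
        propensities[b] += learning_rate * (target - propensities[b])
    total = sum(propensities.values())                 # renormalize to a distribution
    for b in actions:
        propensities[b] /= total
    return a, pay

props = {"A": 0.5, "B": 0.5}
payoffs = {"A": lambda: random.gauss(1.0, 0.5), "B": lambda: random.gauss(0.3, 0.5)}
for _ in range(500):
    abar_step(props, payoffs, aspiration=0.6)
print(props)   # propensity drifts toward the action that usually beats the aspiration
```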
Cristopher Moore and Stephan Mertens
- Published in print:
- 2011
- Published Online:
- December 2013
- ISBN:
- 9780199233212
- eISBN:
- 9780191775079
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199233212.003.0012
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
Random sampling is a technique for dealing with an exponentially large space of possible states or solutions. The best method of random sampling generally involves a random walk or a Markov chain. A Markov chain requires a number of steps to approach equilibrium and thus to provide a good random sample of the state space. This number of steps is called the mixing time, which can be estimated by considering how quickly the chain's random choices overwhelm the system's memory of its initial state, the extent to which one part of a system influences another, and how smoothly probability flows from one part of the state space to another. This chapter explores random walks and rapid mixing, first by considering a classic example from physics: a block of iron. It then discusses transition matrices, ergodicity, coupling, spectral gap, and expanders, as well as the role of conductance and the spectral gap in rapid mixing. It concludes by showing that temporal mixing is closely associated with spatial mixing.
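A short sketch connecting the terms in the abstract: for a lazy random walk on a cycle, the spectral gap of the transition matrix gives a rough mixing-time scale, and the total-variation distance to the stationary distribution can be tracked directly. The chain size, laziness, and the 1/4 threshold are conventional choices, not taken from the chapter.

```python
# Illustrative sketch: spectral gap and mixing for a lazy random walk on a cycle.
import numpy as np

n = 10
P = np.zeros((n, n))
for i in range(n):                       # lazy random walk on an n-cycle
    P[i, i] = 0.5
    P[i, (i - 1) % n] = 0.25
    P[i, (i + 1) % n] = 0.25

eigvals = np.sort(np.abs(np.linalg.eigvals(P)))[::-1]
gap = 1.0 - eigvals[1]                   # spectral gap = 1 - second largest eigenvalue modulus
print(f"spectral gap = {gap:.4f}, rough mixing-time scale ~ {1.0/gap:.1f} steps")

# total-variation distance from the uniform stationary distribution, starting at state 0
pi_t = np.eye(n)[0]
for t in range(1, 201):
    pi_t = pi_t @ P
    if 0.5 * np.abs(pi_t - 1.0 / n).sum() < 0.25:   # conventional 1/4 threshold
        print("TV distance below 1/4 after", t, "steps")
        break
```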
Eric Renshaw
- Published in print:
- 2011
- Published Online:
- September 2011
- ISBN:
- 9780199575312
- eISBN:
- 9780191728778
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199575312.003.0005
- Subject:
- Mathematics, Applied Mathematics, Mathematical Biology
This chapter considers Markov chains and branching processes. It covers the two-state Markov chain, examples of m-state Markov chains, the Ehrenfest model, and branching processes.
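A minimal sketch of the two-state chain mentioned in the abstract, using the closed-form stationary distribution (b, a)/(a + b) for P = [[1-a, a], [b, 1-b]]; the values of a and b are placeholders, not the book's example.

```python
# Minimal sketch: the two-state Markov chain and its stationary distribution.
import numpy as np

a, b = 0.2, 0.5
P = np.array([[1 - a, a],
              [b, 1 - b]])

stationary = np.array([b, a]) / (a + b)          # closed form for the two-state chain
print(np.allclose(stationary @ P, stationary))   # True: pi P = pi

pn = np.linalg.matrix_power(P, 50)               # n-step transition probabilities
print(pn.round(4))                               # every row approaches the stationary vector
```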
ZIHENG YANG
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.003.0001
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter discusses models of nucleotide substitution and calculation of the distance between a pair of sequences. It introduces the theory of Markov chains and the maximum likelihood method, which are used extensively later in the book. Exercises are provided at the end of the chapter.
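As an illustration of the kind of distance calculation the chapter introduces, here is a short sketch of the JC69 distance d = -(3/4) ln(1 - 4p/3), where p is the observed proportion of differing sites; the toy sequences are invented.

```python
# Hedged sketch of a pairwise distance under the Jukes-Cantor (1969) model;
# the two toy sequences below are invented.
import math

def jc69_distance(seq1, seq2):
    """Expected substitutions per site under the JC69 model."""
    assert len(seq1) == len(seq2)
    p = sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)
    if p >= 0.75:
        raise ValueError("distance undefined: too many differences")
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

s1 = "ACGTACGTACGTACGTACGT"
s2 = "ACGTACGAACGTACGTTCGT"     # 2 differences out of 20 sites, so p = 0.1
print(jc69_distance(s1, s2))    # ~ 0.107 substitutions per site
```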
ZIHENG YANG
- Published in print:
- 2006
- Published Online:
- April 2010
- ISBN:
- 9780198567028
- eISBN:
- 9780191728280
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198567028.003.0005
- Subject:
- Biology, Evolutionary Biology / Genetics
This chapter provides a brief introduction to the theory and computation of Bayesian statistics and its applications to molecular evolution. It uses simple examples, such as distance estimation under the JC69 model, to introduce the general principles. It discusses the application of Bayesian inference to reconstruction of phylogenetic trees and to population genetics analysis under the coalescent. Exercises are provided at the end of the chapter.
Ziheng Yang
- Published in print:
- 2014
- Published Online:
- August 2014
- ISBN:
- 9780199602605
- eISBN:
- 9780191782251
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199602605.003.0007
- Subject:
- Biology, Biomathematics / Statistics and Data Analysis / Complexity Studies, Evolutionary Biology / Genetics
This chapter provides a detailed introduction to modern Bayesian computation. The Metropolis–Hastings algorithm is illustrated using a simple example of distance estimation between two sequences. A number of generic Markov chain Monte Carlo (MCMC) proposal moves are described, and the calculation of their proposal ratios is illustrated. The chapter discusses the convergence rate of the Markov chain as well as its mixing efficiency, as influenced by the MCMC proposal. The chapter also illustrates several advanced MCMC algorithms, including parallel tempering (Metropolis-coupled MCMC or MCMCMC), which uses heated chains to improve mixing when there are multiple local peaks on the posterior surface; reversible jump MCMC (rjMCMC), which is used in trans-model and trans-dimensional inference; and calculation of the Bayes factor used in Bayesian model selection.
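In the spirit of the chapter's running example (this is not the book's code), a hedged sketch of Metropolis–Hastings for the JC69 distance between two sequences follows, with a symmetric sliding-window proposal reflected at zero so that the proposal ratio is one; the data counts and the exponential prior are invented.

```python
# Hedged sketch: Metropolis-Hastings for the sequence distance d under JC69,
# with x differing sites out of n, an exponential prior, and a symmetric
# sliding-window proposal reflected at zero (proposal ratio 1).
import math
import random

n, x = 500, 90                      # invented data: 90 differences in 500 sites
prior_mean = 0.2                    # exponential prior on d
rng = random.Random(0)

def log_posterior(d):
    if d <= 0:
        return float("-inf")
    p = 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))   # P(site differs | d) under JC69
    return x * math.log(p) + (n - x) * math.log(1.0 - p) - d / prior_mean

d, window = 0.1, 0.05
logp = log_posterior(d)
samples = []
for _ in range(50_000):
    d_new = d + rng.uniform(-window, window)
    if d_new < 0:
        d_new = -d_new                             # reflect to keep the proposal symmetric
    logp_new = log_posterior(d_new)
    if rng.random() < math.exp(min(0.0, logp_new - logp)):
        d, logp = d_new, logp_new                  # accept
    samples.append(d)

burn = samples[10_000:]
print(sum(burn) / len(burn))        # posterior mean, close to the ML estimate of about 0.21
```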
Pablo A. Goloboff and Diego Pol
- Published in print:
- 2006
- Published Online:
- September 2007
- ISBN:
- 9780199297306
- eISBN:
- 9780191713729
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780199297306.003.0008
- Subject:
- Biology, Evolutionary Biology / Genetics
The intent of a statistically based phylogenetic method is to estimate tree topologies and values of possibly relevant parameters, as well as the uncertainty inherent in those estimations. A method that could do that with reasonable accuracy would be attractive indeed. It is often claimed that it is advantageous for a method to be based on a specific evolutionary model, because that allows incorporating into the analysis the ‘knowledge’ of the real world embodied in the model. Bayesian methods have become very prominent among model-based methods, in part because of computational advantages, and in part because they estimate the probability that a given hypothesis is true, given the observations and model assumptions. Through simulation studies, this chapter finds that even if there is the potential for Bayesian estimations of monophyly to provide correct topological estimations for infinite numbers of characters, the resulting claims to measure degrees of support for conclusions — in a statistical sense — are unfounded. Additionally, for large numbers of terminals, it is argued to be extremely unlikely that a search via Markov chain Monte Carlo techniques would ever pass through the optimal tree(s), let alone pass through the optimal tree(s) enough times to estimate their posterior probability with any accuracy.
Marc Mézard and Andrea Montanari
- Published in print:
- 2009
- Published Online:
- September 2009
- ISBN:
- 9780198570837
- eISBN:
- 9780191718755
- Item type:
- chapter
- Publisher:
- Oxford University Press
- DOI:
- 10.1093/acprof:oso/9780198570837.003.0004
- Subject:
- Physics, Theoretical, Computational, and Statistical Physics
The three fields that form the subject of this book all deal with large sets of random variables. Not surprisingly, they possess common underlying structures and techniques. This chapter describes some of them, with emphasis on the mathematical structures. On the one hand, it discusses large deviations, Sanov's theorem, and asymptotic equipartition. On the other hand, it introduces Markov chains for Monte Carlo computations and their application to optimization with simulated annealing.
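To illustrate the last point, here is a minimal sketch of simulated annealing: a Metropolis-style Markov chain whose temperature is gradually lowered. The one-dimensional cost function, cooling schedule, and move size are arbitrary choices, not the book's example.

```python
# Illustrative sketch (not from the book): simulated annealing on a toy
# one-dimensional energy with several local minima.
import math
import random

def energy(s):
    return s * s + 10.0 * math.sin(3.0 * s)      # invented rugged cost function

rng = random.Random(0)
s, best = 5.0, 5.0
T = 5.0
for step in range(50_000):
    s_new = s + rng.uniform(-0.5, 0.5)            # local move
    dE = energy(s_new) - energy(s)
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        s = s_new                                 # Metropolis acceptance at temperature T
    if energy(s) < energy(best):
        best = s
    T = max(1e-3, T * 0.9999)                     # slow geometric cooling schedule

print(best, energy(best))                         # ends near the global minimum
```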