J. Durbin and S.J. Koopman
- Published in print: 2012
- Published Online: December 2013
- ISBN: 9780199641178
- eISBN: 9780191774881
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/acprof:oso/9780199641178.003.0009
- Subject: Mathematics, Probability / Statistics
This chapter discusses the range of non-Gaussian and nonlinear models that can be analysed using the methods of Part II. The chapter is organized as follows. Section 9.2 considers an important special form of the general linear non-Gaussian model. Sections 9.3–9.6 examine some important subclasses of models, namely exponential family models, heavy-tailed models, stochastic volatility models and other financial models. Section 9.7 describes some classes of nonlinear models of interest.
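For orientation, the sketch below shows the general form of model the abstract refers to, written in notation assumed here for illustration (observation y_t, state alpha_t, signal theta_t) rather than quoted from the chapter; the stochastic volatility example is one standard specification of that form, not necessarily the one used in Section 9.5.

```latex
% Minimal sketch (assumed notation, not quoted from the chapter); compiles with amsmath.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Linear Gaussian state space model (the baseline treated in Part II):
\begin{align*}
  y_t &= Z_t \alpha_t + \varepsilon_t, & \varepsilon_t &\sim N(0, H_t),\\
  \alpha_{t+1} &= T_t \alpha_t + R_t \eta_t, & \eta_t &\sim N(0, Q_t).
\end{align*}
% Non-Gaussian extension: keep the linear state equation, but let the
% observations depend on the signal \theta_t = Z_t \alpha_t through a
% general (e.g. exponential family) observation density:
\begin{align*}
  y_t \sim p(y_t \mid \theta_t), \qquad \theta_t = Z_t \alpha_t.
\end{align*}
% Illustration: a basic stochastic volatility model, with the log-variance
% \theta_t following a stationary AR(1) state process:
\begin{align*}
  y_t &= \sigma \exp\!\bigl(\tfrac{1}{2}\theta_t\bigr)\,\varepsilon_t, & \varepsilon_t &\sim N(0, 1),\\
  \theta_{t+1} &= \phi\,\theta_t + \eta_t, & \eta_t &\sim N(0, \sigma_\eta^2).
\end{align*}
\end{document}
```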
Russell Cheng
- Published in print: 2017
- Published Online: September 2017
- ISBN: 9780198505044
- eISBN: 9780191746390
- Item type: chapter
- Publisher: Oxford University Press
- DOI: 10.1093/oso/9780198505044.003.0003
- Subject: Mathematics, Probability / Statistics
This book relies on maximum likelihood (ML) estimation of parameters. Asymptotic theory assumes regularity conditions under which the ML estimator is consistent. Typically, an additional third-derivative condition is assumed to ensure that the ML estimator is also asymptotically normally distributed. The standard asymptotic results that then hold are summarized in this chapter; for example, the asymptotic variance of the ML estimator is given by the Fisher information formula, and the log-likelihood ratio, Wald and score statistics for testing the statistical significance of parameter estimates are all asymptotically equivalent. Also, the useful profile log-likelihood then behaves exactly like a standard log-likelihood, but in a parameter space of just one dimension. Further, the model can be reparametrized to make it locally orthogonal in the neighbourhood of the true parameter value. Finally, the large exponential family of models, for which a unified set of regularity conditions can be obtained, is briefly reviewed.
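For reference, the sketch below states the standard asymptotic results the abstract alludes to, in generic textbook notation (sample size n, full-sample log-likelihood, ML estimator) rather than the chapter's own; it is a summary under the usual i.i.d. regularity conditions, not a quotation from the book.

```latex
% Textbook statement of the standard asymptotic results (generic notation,
% not the chapter's own); compiles with amsmath.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Per-observation Fisher information and asymptotic normality of the ML
% estimator \hat\theta of the p-dimensional parameter \theta (true value \theta_0):
\begin{align*}
  I(\theta) = -\operatorname{E}\!\left[\frac{\partial^2 \log f(Y;\theta)}
    {\partial\theta\,\partial\theta^{\mathsf T}}\right],
  \qquad
  \sqrt{n}\,(\hat\theta - \theta_0) \xrightarrow{\ d\ } N\!\bigl(0,\, I(\theta_0)^{-1}\bigr).
\end{align*}
% Three asymptotically equivalent statistics for testing H_0: \theta = \theta_0,
% each converging in distribution to \chi^2_p under H_0; \ell is the full-sample
% log-likelihood and U(\theta) = \partial\ell(\theta)/\partial\theta its score:
\begin{align*}
  \mathrm{LR} &= 2\bigl\{\ell(\hat\theta) - \ell(\theta_0)\bigr\},\\
  \mathrm{W}  &= n\,(\hat\theta - \theta_0)^{\mathsf T} I(\hat\theta)\,(\hat\theta - \theta_0),\\
  \mathrm{S}  &= \tfrac{1}{n}\, U(\theta_0)^{\mathsf T} I(\theta_0)^{-1}\, U(\theta_0).
\end{align*}
% Profile log-likelihood for a scalar parameter of interest \psi, with the
% nuisance parameters \lambda maximized out:
\begin{align*}
  \ell_P(\psi) = \max_{\lambda}\, \ell(\psi, \lambda).
\end{align*}
\end{document}
```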