Tools for Statistical Inference
Methods for the Exploration of Posterior Distributions and Likelihood Functions
by Martin A. Tanner
Publisher: Springer New York
Series: Springer Series in Statistics
Series: Springer Texts in Statistics
E-book / PDF
Copy protection: PDF with watermark

Note: After checkout, a download link is provided immediately. The link can then be opened on a PC, smartphone, or e-book reader.
E-books can be paid for via PayPal. If you would like to pay for e-books by invoice, please contact us.

ISBN: 978-1-4612-4024-2
Edition: 3rd ed. 1996
Published: December 6, 2012
Language: English
Length: 208 pages

Price: €117.69

Table of Contents
Description

1. Introduction
   Exercises
2. Normal Approximations to Likelihoods and to Posteriors
   2.1. Likelihood/Posterior Density
   2.2. Specification of the Prior
   2.3. Maximum Likelihood
   2.4. Normal-Based Inference
   2.5. The δ-Method (Propagation of Errors)
   2.6. Highest Posterior Density Regions
   Exercises
3. Nonnormal Approximations to Likelihoods and Posteriors
   3.1. Numerical Integration
   3.2. Posterior Moments and Marginalization Based on Laplace's Method
   3.3. Monte Carlo Methods
   Exercises
4. The EM Algorithm
   4.1. Introduction
   4.2. Theory
   4.3. EM in the Exponential Family
   4.4. Standard Errors in the Context of EM
   4.5. Monte Carlo Implementation of the E-Step
   4.6. Acceleration of EM (Louis' Turbo EM)
   4.7. Facilitating the M-Step
   Exercises
5. The Data Augmentation Algorithm
   5.1. Introduction and Motivation
   5.2. Computing and Sampling from the Predictive Distribution
   5.3. Calculating the Content and Boundary of the HPD Region
   5.4. Remarks on the General Implementation of the Data Augmentation Algorithm
   5.5. Overview of the Convergence Theory of Data Augmentation
   5.6. Poor Man's Data Augmentation Algorithms
   5.7. Sampling/Importance Resampling (SIR)
   5.8. General Imputation Methods
   5.9. Further Importance Sampling Ideas
   5.10. Sampling in the Context of Multinomial Data
   Exercises
6. Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm
   6.1. Introduction to the Gibbs Sampler
   6.2. Examples
   6.3. Assessing Convergence of the Chain
   6.4. The Griddy Gibbs Sampler
   6.5. The Metropolis Algorithm
   6.6. Conditional Inference via the Gibbs Sampler
   Exercises
References



A unified introduction to a variety of computational algorithms for likelihood and Bayesian inference. This third edition expands the discussion of many of the techniques presented and includes additional examples, as well as exercise sets at the end of each chapter.

