Bültmann & Gerriets
Tools for Statistical Inference
Methods for the Exploration of Posterior Distributions and Likelihood Functions
by Martin A. Tanner
Publisher: Springer New York
Series: Springer Series in Statistics
Series: Springer Texts in Statistics
E-Book / PDF
Copy protection: PDF with watermark

Note: After checkout, a download link is provided immediately. The link can then be opened on a PC, smartphone, or e-book reader.
E-books can be paid for via PayPal. If you would like to pay for e-books by invoice, please contact us.

ISBN: 978-1-4684-0192-9
Edition: 2nd ed. 1993
Published on 06.12.2012
Language: English

Price: 82.38 €

Table of Contents
Blurb

1 Introduction
2 Normal Approximations to Likelihoods and to Posteriors
  2.1 Likelihood/Posterior Density
  2.2 Maximum Likelihood
    2.2.1 Newton-Raphson
    2.2.2 Examples
  2.3 Normal-Based Inference
  2.4 The δ-Method (Propagation of Errors)
  2.5 Highest Posterior Density Regions
3 Nonnormal Approximations to Likelihoods and to Posteriors
  3.1 Conjugate Priors and Numerical Integration
  3.2 Posterior Moments and Marginalization Based on Laplace's Method
    3.2.1 Moments
    3.2.2 Marginalization
  3.3 Monte Carlo Methods
    3.3.1 Monte Carlo
    3.3.2 Composition
    3.3.3 Importance Sampling and Rejection/Acceptance
4 The EM Algorithm
  4.1 Introduction
  4.2 Theory
  4.3 EM in the Exponential Family
  4.4 Standard Errors in the Context of EM
    4.4.1 Direct Computation/Numerical Differentiation
    4.4.2 Missing Information Principle
    4.4.3 Louis' Method
    4.4.4 Simulation
    4.4.5 Using EM Iterates
  4.5 Monte Carlo Implementation of the E-Step
  4.6 Acceleration of EM (Louis' Turbo EM)
5 The Data Augmentation Algorithm
  5.1 Introduction and Motivation
  5.2 Computing and Sampling from the Predictive Distribution
  5.3 Calculating the Content and Boundary of the HPD Region
    5.3.1 Calculating the Content
    5.3.2 Calculating the Boundary
  5.4 Remarks on the General Implementation of the Data Augmentation Algorithm
  5.5 Overview of the Convergence Theory of Data Augmentation
  5.6 Poor Man's Data Augmentation Algorithms
    5.6.1 PMDA 1
    5.6.2 PMDA Exact
    5.6.3 PMDA 2
  5.7 Sampling/Importance Resampling (SIR)
  5.8 General Imputation Methods
    5.8.1 Introduction
    5.8.2 Hot Deck Imputation
    5.8.3 Simple Residual Imputation
    5.8.4 Normal and Adjusted Normal Imputation
    5.8.5 Nonignorable Nonresponse
      5.8.5.1 Mixture Model, Without Followup Data
      5.8.5.2 Mixture Model, With Followup Data
      5.8.5.3 Selection Model, Without Followup Data
      5.8.5.4 Selection Model, With Followup Data
  5.9 Further Importance Sampling Ideas
    5.9.1 Sampling from the Predictive Identity
    5.9.2 Sequential Imputation
    5.9.3 Calculating the Posterior
  5.10 Sampling in the Context of Multinomial Data
    5.10.1 Dirichlet Sampling
    5.10.2 Latent Class Analysis
6 Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm
  6.1 Introduction to the Gibbs Sampler
    6.1.1 Chained Data Augmentation
    6.1.2 Multivariate Chained Data Augmentation: The Gibbs Sampler
    6.1.3 Historical Comments
  6.2 Examples
    6.2.1 Rat Growth Data
    6.2.2 Poisson Process with a Change Point
    6.2.3 Generalized Linear Models with Random Effects
  6.3 Assessing Convergence of the Chain
    6.3.1 The Gibbs Stopper
    6.3.2 Control Variates
    6.3.3 Alternative Methods
  6.4 The Griddy Gibbs Sampler
    6.4.1 Example
    6.4.2 Adaptive Grid/Grid Grower
    6.4.3 Nonlinear Regression
    6.4.4 Cox Model
  6.5 The Metropolis Algorithm
    6.5.1 Elements of Discrete-space Markov Chains
    6.5.2 Metropolis' Method
    6.5.3 Metropolis Subchains
  6.6 Conditional Inference via the Gibbs Sampler
    6.6.1 Introduction
    6.6.2 Skovgaard's Approximation
    6.6.3 Two-way Contingency Tables
    6.6.4 Example
References
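To give a flavor of the earliest technique in the outline, Newton-Raphson maximization of a likelihood (Section 2.2.1): the sketch below is not from the book; it is a minimal illustration assuming i.i.d. Poisson data, for which the maximum likelihood estimate is known to be the sample mean.

```python
# Minimal Newton-Raphson sketch for maximum likelihood (illustrative only).
# Assumes i.i.d. Poisson data: l(theta) = sum(x) * log(theta) - n * theta.

def newton_raphson_poisson_mle(data, theta0=1.0, tol=1e-10, max_iter=100):
    """Iterate theta <- theta + score / information until the step is tiny."""
    n = len(data)
    s = sum(data)
    theta = theta0
    for _ in range(max_iter):
        score = s / theta - n        # first derivative l'(theta)
        info = s / theta ** 2        # observed information, -l''(theta)
        step = score / info
        theta += step
        if abs(step) < tol:
            break
    return theta

data = [2, 3, 1, 4, 0, 2, 5, 3]
theta_hat = newton_raphson_poisson_mle(data)
print(theta_hat)  # converges to the sample mean, 2.5
```

For this toy model the update has a closed form, but the same score/information iteration is what applies in models without one.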



This book provides a unified introduction to a variety of computational algorithms for likelihood and Bayesian inference. In this second edition, I have attempted to expand the treatment of many of the techniques discussed, as well as include important topics such as the Metropolis algorithm and methods for assessing the convergence of a Markov chain algorithm. Prerequisites for this book include an understanding of mathematical statistics at the level of Bickel and Doksum (1977), some understanding of the Bayesian approach as in Box and Tiao (1973), experience with conditional inference at the level of Cox and Snell (1989), and exposure to statistical models as found in McCullagh and Nelder (1989). I have chosen not to present the proofs of convergence or rates of convergence, since these proofs may require substantial background in Markov chain theory which is beyond the scope of this book. However, references to these proofs are given. There has been an explosion of papers in the area of Markov chain Monte Carlo in the last five years. I have attempted to identify key references, though due to the volatility of the field some work may have been missed.
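The blurb highlights the Metropolis algorithm among the second edition's additions. The following sketch is not from the book; it is a minimal random-walk Metropolis sampler, assuming a standard normal target density known only up to a normalizing constant.

```python
# Minimal random-walk Metropolis sketch (illustrative only).
import math
import random

def metropolis(log_target, x0=0.0, n_samples=10000, step=1.0, seed=1):
    """Draw a Markov chain whose stationary density is proportional to exp(log_target)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)       # symmetric random-walk proposal
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:    # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)                         # keep the current state either way
    return samples

# Target: standard normal, specified only up to a constant.
samples = metropolis(lambda x: -0.5 * x * x)
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to 0, the target mean
```

Because the acceptance ratio uses only a density ratio, the normalizing constant cancels, which is what makes the method attractive for posterior distributions known only up to proportionality.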

