Bültmann & Gerriets
Image Analysis, Random Fields and Dynamic Monte Carlo Methods
A Mathematical Introduction
by Gerhard Winkler
Publisher: Springer Berlin Heidelberg
Series: Stochastic Modelling and Applied Probability, No. 27
E-Book / PDF
Copy protection: PDF with watermark

Note: After checkout, a download link is provided immediately. The link can then be opened on a PC, smartphone, or e-book reader.
E-books can be paid for via PayPal. If you would like to pay for an e-book by invoice, please contact us.

ISBN: 978-3-642-97522-6
Edition: 1995
Published: 6 December 2012
Language: English
Length: 324 pages

Price: €53.49

Table of Contents

I. Bayesian Image Analysis: Introduction
  1. The Bayesian Paradigm
    1.1 The Space of Images
    1.2 The Space of Observations
    1.3 Prior and Posterior Distribution
    1.4 Bayesian Decision Rules
  2. Cleaning Dirty Pictures
    2.1 Distortion of Images
      2.1.1 Physical Digital Imaging Systems
      2.1.2 Posterior Distributions
    2.2 Smoothing
    2.3 Piecewise Smoothing
    2.4 Boundary Extraction
  3. Random Fields
    3.1 Markov Random Fields
    3.2 Gibbs Fields and Potentials
    3.3 More on Potentials
II. The Gibbs Sampler and Simulated Annealing
  4. Markov Chains: Limit Theorems
    4.1 Preliminaries
    4.2 The Contraction Coefficient
    4.3 Homogeneous Markov Chains
    4.4 Inhomogeneous Markov Chains
  5. Sampling and Annealing
    5.1 Sampling
    5.2 Simulated Annealing
    5.3 Discussion
  6. Cooling Schedules
    6.1 The ICM Algorithm
    6.2 Exact MAPE Versus Fast Cooling
    6.3 Finite Time Annealing
  7. Sampling and Annealing Revisited
    7.1 A Law of Large Numbers for Inhomogeneous Markov Chains
      7.1.1 The Law of Large Numbers
      7.1.2 A Counterexample
    7.2 A General Theorem
    7.3 Sampling and Annealing under Constraints
      7.3.1 Simulated Annealing
      7.3.2 Simulated Annealing under Constraints
      7.3.3 Sampling with and without Constraints
III. More on Sampling and Annealing
  8. Metropolis Algorithms
    8.1 The Metropolis Sampler
    8.2 Convergence Theorems
    8.3 Best Constants
    8.4 About Visiting Schemes
      8.4.1 Systematic Sweep Strategies
      8.4.2 The Influence of Proposal Matrices
    8.5 The Metropolis Algorithm in Combinatorial Optimization
    8.6 Generalizations and Modifications
      8.6.1 Metropolis-Hastings Algorithms
      8.6.2 Threshold Random Search
  9. Alternative Approaches
    9.1 Second Largest Eigenvalues
      9.1.1 Convergence Reproved
      9.1.2 Sampling and Second Largest Eigenvalues
      9.1.3 Continuous Time and Space
  10. Parallel Algorithms
    10.1 Partially Parallel Algorithms
      10.1.1 Synchronous Updating on Independent Sets
      10.1.2 The Swendsen-Wang Algorithm
    10.2 Synchronous Algorithms
      10.2.1 Introduction
      10.2.2 Invariant Distributions and Convergence
      10.2.3 Support of the Limit Distribution
    10.3 Synchronous Algorithms and Reversibility
      10.3.1 Preliminaries
      10.3.2 Invariance and Reversibility
      10.3.3 Final Remarks
IV. Texture Analysis
  11. Partitioning
    11.1 Introduction
    11.2 How to Tell Textures Apart
    11.3 Features
    11.4 Bayesian Texture Segmentation
      11.4.1 The Features
      11.4.2 The Kolmogorov-Smirnov Distance
      11.4.3 A Partition Model
      11.4.4 Optimization
      11.4.5 A Boundary Model
    11.5 Julesz's Conjecture
      11.5.1 Introduction
      11.5.2 Point Processes
  12. Texture Models and Classification
    12.1 Introduction
    12.2 Texture Models
      12.2.1 The ?-Model
      12.2.2 The Autobinomial Model
      12.2.3 Automodels
    12.3 Texture Synthesis
    12.4 Texture Classification
      12.4.1 General Remarks
      12.4.2 Contextual Classification
      12.4.3 MPM Methods
V. Parameter Estimation
  13. Maximum Likelihood Estimators
    13.1 Introduction
    13.2 The Likelihood Function
    13.3 Objective Functions
    13.4 Asymptotic Consistency
  14. Spatial ML Estimation
    14.1 Introduction
    14.2 Increasing Observation Windows
    14.3 The Pseudolikelihood Method
    14.4 The Maximum Likelihood Method
    14.5 Computation of ML Estimators
    14.6 Partially Observed Data
VI. Supplement
  15. A Glance at Neural Networks
    15.1 Introduction
    15.2 Boltzmann Machines
    15.3 A Learning Rule
  16. Mixed Applications
    16.1 Motion
    16.2 Tomographic Image Reconstruction
    16.3 Biological Shape
VII. Appendix
  A. Simulation of Random Variables
    A.1 Pseudo-random Numbers
    A.2 Discrete Random Variables
    A.3 Local Gibbs Samplers
    A.4 Further Distributions
      A.4.1 Binomial Variables
      A.4.2 Poisson Variables
      A.4.3 Gaussian Variables
      A.4.4 The Rejection Method
      A.4.5 The Polar Method
  B. The Perron-Frobenius Theorem
  C. Concave Functions
  D. A Global Convergence Theorem for Descent Algorithms
References

