Bültmann & Gerriets
Information and Coding Theory
by J. Mary Jones, Gareth A. Jones
Publisher: Springer London
Series: Springer Undergraduate Mathematics Series
Hardcover
ISBN: 978-1-85233-622-6
Edition: 2000
Published on 26 June 2000
Language: English
Dimensions: 235 mm (H) x 155 mm (W) x 13 mm (D)
Weight: 353 grams
Extent: 228 pages

Price: €48.14
No shipping costs (domestic)


This title is printed only when ordered; it will therefore arrive with us around 12 November.

Delivery within the city usually takes place the same day.
Shipping elsewhere via Post/DHL usually takes 1-2 days.

Back Cover Text

As this Preface is being written, the twentieth century is coming to an end. Historians may perhaps come to refer to it as the century of information, just as its predecessor is associated with the process of industrialisation. Successive technological developments such as the telephone, radio, television, computers and the Internet have had profound effects on the way we live. We can see pictures of the surface of Mars or the early shape of the Universe. The contents of a whole shelf-load of library books can be compressed onto an almost weightless piece of plastic. Billions of people can watch the same football match, or can keep in instant touch with friends around the world without leaving home. In short, massive amounts of information can now be stored, transmitted and processed, with surprising speed, accuracy and economy.

Of course, these developments do not happen without some theoretical basis, and as is so often the case, much of this is provided by mathematics. Many of the first mathematical advances in this area were made in the mid-twentieth century by engineers, often relying on intuition and experience rather than a deep theoretical knowledge to lead them to their discoveries. Soon the mathematicians, delighted to see new applications for their subject, joined in and developed the engineers' practical examples into wide-ranging theories, complete with definitions, theorems and proofs.



Table of Contents

1. Source Coding: 1.1 Definitions and Examples. 1.2 Uniquely Decodable Codes. 1.3 Instantaneous Codes. 1.4 Constructing Instantaneous Codes. 1.5 Kraft's Inequality. 1.6 McMillan's Inequality. 1.7 Comments on Kraft's and McMillan's Inequalities. 1.8 Supplementary Exercises.
2. Optimal Codes: 2.1 Optimality. 2.2 Binary Huffman Codes. 2.3 Average Word-length of Huffman Codes. 2.4 Optimality of Binary Huffman Codes. 2.5 r-ary Huffman Codes. 2.6 Extensions of Sources. 2.7 Supplementary Exercises.
3. Entropy: 3.1 Information and Entropy. 3.2 Properties of the Entropy Function. 3.3 Entropy and Average Word-length. 3.4 Shannon-Fano Coding. 3.5 Entropy of Extensions and Products. 3.6 Shannon's First Theorem. 3.7 An Example of Shannon's First Theorem. 3.8 Supplementary Exercises.
4. Information Channels: 4.1 Notation and Definitions. 4.2 The Binary Symmetric Channel. 4.3 System Entropies. 4.4 System Entropies for the Binary Symmetric Channel. 4.5 Extension of Shannon's First Theorem to Information Channels. 4.6 Mutual Information. 4.7 Mutual Information for the Binary Symmetric Channel. 4.8 Channel Capacity. 4.9 Supplementary Exercises.
5. Using an Unreliable Channel: 5.1 Decision Rules. 5.2 An Example of Improved Reliability. 5.3 Hamming Distance. 5.4 Statement and Outline Proof of Shannon's Theorem. 5.5 The Converse of Shannon's Theorem. 5.6 Comments on Shannon's Theorem. 5.7 Supplementary Exercises.
6. Error-correcting Codes: 6.1 Introductory Concepts. 6.2 Examples of Codes. 6.3 Minimum Distance. 6.4 Hamming's Sphere-packing Bound. 6.5 The Gilbert-Varshamov Bound. 6.6 Hadamard Matrices and Codes. 6.7 Supplementary Exercises.
7. Linear Codes: 7.1 Matrix Description of Linear Codes. 7.2 Equivalence of Linear Codes. 7.3 Minimum Distance of Linear Codes. 7.4 The Hamming Codes. 7.5 The Golay Codes. 7.6 The Standard Array. 7.7 Syndrome Decoding. 7.8 Supplementary Exercises.
Suggestions for Further Reading. Appendix A: Proof of the Sardinas-Patterson Theorem. Appendix B: The Law of Large Numbers. Appendix C: Proof of Shannon's Fundamental Theorem. Solutions to Exercises. Index of Symbols and Abbreviations.
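
Chapters 2 and 3 cover binary Huffman codes and entropy. As a rough, self-contained illustration of those two topics (not material taken from the book), the following Python sketch builds a binary Huffman code for a small, made-up four-symbol source and compares its average word-length with the source entropy.

import heapq
from math import log2

def huffman_code(probs):
    # Build a binary Huffman code for the symbol-probability map `probs`.
    # Heap entries are (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing their codewords with 0 and 1.
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

# Hypothetical source, chosen only for illustration.
probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(code)      # one optimal prefix code, e.g. a->0, b->10, c->111, d->110
print(avg_len)   # average word-length: 1.9 binary digits per symbol
print(entropy)   # source entropy H is about 1.85 bits, so the code is close to the bound

An optimal binary code always has average word-length L with H <= L < H + 1, as the numbers above illustrate; encoding extensions of the source (Section 2.6) drives the average length per symbol down towards H, which is the content of Shannon's First Theorem (Section 3.6).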

