A First Course in Information Theory

Bibliographic Details
Main Author: Yeung, Raymond W.
Format: eBook
Language: English
Published: New York, NY: Springer US, 2002
Edition: 1st ed. 2002
Series: Information Technology: Transmission, Processing and Storage
Collection: Springer Book Archives -2004 (collection details: see MPG.ReNa)
LEADER 05859nmm a2200457 u 4500
001 EB000616446
003 EBX01000000000000000469528
005 00000000000000.0
007 cr|||||||||||||||||||||
008 140122 ||| eng
020 |a 9781441986085 
100 1 |a Yeung, Raymond W. 
245 0 0 |a A First Course in Information Theory  |h Elektronische Ressource  |c by Raymond W. Yeung 
250 |a 1st ed. 2002 
260 |a New York, NY  |b Springer US  |c 2002 
300 |a XXIII, 412 p  |b online resource 
505 0 |a 1. The Science of Information -- 2. Information Measures -- 2.1 Independence and Markov Chains -- 2.2 Shannon’s Information Measures -- 2.3 Continuity of Shannon’s Information Measures -- 2.4 Chain Rules -- 2.5 Informational Divergence -- 2.6 The Basic Inequalities -- 2.7 Some Useful Information Inequalities -- 2.8 Fano’s Inequality -- 2.9 Entropy Rate of Stationary Source -- Problems -- Historical Notes -- 3. Zero-Error Data Compression -- 3.1 The Entropy Bound -- 3.2 Prefix Codes -- 3.3 Redundancy of Prefix Codes -- Problems -- Historical Notes -- 4. Weak Typicality -- 4.1 The Weak -- 4.2 The Source Coding Theorem -- 4.3 Efficient Source Coding -- 4.4 The Shannon-McMillan-Breiman Theorem -- Problems -- Historical Notes -- 5. Strong Typicality -- 5.1 Strong -- 5.2 Strong Typicality Versus Weak Typicality -- 5.3 Joint Typicality -- 5.4 An Interpretation of the Basic Inequalities -- Problems -- Historical Notes -- 6. The I-Measure -- 6.1 Preliminaries --  
505 0 |a 6.2 The I-Measure for Two Random Variables -- 6.3 Construction of the I-Measure μ* -- 6.4 μ* Can be Negative -- 6.5 Information Diagrams -- 6.6 Examples of Applications -- Appendix 6.A: A Variation of the Inclusion-Exclusion Formula -- Problems -- Historical Notes -- 7. Markov Structures -- 7.1 Conditional Mutual Independence -- 7.2 Full Conditional Mutual Independence -- 7.3 Markov Random Field -- 7.4 Markov Chain -- Problems -- Historical Notes -- 8. Channel Capacity -- 8.1 Discrete Memoryless Channels -- 8.2 The Channel Coding Theorem -- 8.3 The Converse -- 8.4 Achievability of the Channel Capacity -- 8.5 A Discussion -- 8.6 Feedback Capacity -- 8.7 Separation of Source and Channel Coding -- Problems -- Historical Notes -- 9. Rate-Distortion Theory -- 9.1 Single-Letter Distortion Measures -- 9.2 The Rate-Distortion Function R(D) -- 9.3 The Rate-Distortion Theorem -- 9.4 The Converse -- 9.5 Achievability of RI(D) -- Problems -- Historical Notes -- 10. The Blahut-Arimoto Algorithms --  
505 0 |a 10.1 Alternating Optimization -- 10.2 The Algorithms -- 10.3 Convergence -- Problems -- Historical Notes -- 11. Single-Source Network Coding -- 11.1 A Point-to-Point Network -- 11.2 What is Network Coding? -- 11.3 A Network Code -- 11.4 The Max-Flow Bound -- 11.5 Achievability of the Max-Flow Bound -- Problems -- Historical Notes -- 12. Information Inequalities -- 12.1 The Region Γ*n -- 12.2 Information Expressions in Canonical Form -- 12.3 A Geometrical Framework -- 12.4 Equivalence of Constrained Inequalities -- 12.5 The Implication Problem of Conditional Independence -- Problems -- Historical Notes -- 13. Shannon-Type Inequalities -- 13.1 The Elemental Inequalities -- 13.2 A Linear Programming Approach -- 13.3 A Duality -- 13.4 Machine Proving -- 13.5 Tackling the Implication Problem -- 13.6 Minimality of the Elemental Inequalities -- Appendix 13.A: The Basic Inequalities and the Polymatroidal Axioms -- Problems -- Historical Notes --  
505 0 |a 14. Beyond Shannon-Type Inequalities -- 14.1 Characterizations of Γ*2, Γ*3, and Γ*n -- 14.2 A Non-Shannon-Type Unconstrained Inequality -- 14.3 A Non-Shannon-Type Constrained Inequality -- 14.4 Applications -- Problems -- Historical Notes -- 15. Multi-Source Network Coding -- 15.1 Two Characteristics -- 15.2 Examples of Application -- 15.3 A Network Code for Acyclic Networks -- 15.4 An Inner Bound -- 15.5 An Outer Bound -- 15.6 The LP Bound and Its Tightness -- 15.7 Achievability of Rin -- Appendix 15.A: Approximation of Random Variables with Infinite Alphabets -- Problems -- Historical Notes -- 16. Entropy and Groups -- 16.1 Group Preliminaries -- 16.2 Group-Characterizable Entropy Functions -- 16.3 A Group Characterization of Γ*n -- 16.4 Information Inequalities and Group Inequalities -- Problems -- Historical Notes 
653 |a Group Theory and Generalizations 
653 |a Engineering 
653 |a Group theory 
653 |a Computer science / Mathematics 
653 |a Discrete Mathematics in Computer Science 
653 |a Electrical and Electronic Engineering 
653 |a Electrical engineering 
653 |a Control theory 
653 |a Probability Theory 
653 |a Systems Theory, Control 
653 |a System theory 
653 |a Discrete mathematics 
653 |a Technology and Engineering 
653 |a Probabilities 
041 0 7 |a eng  |2 ISO 639-2 
989 |b SBA  |a Springer Book Archives -2004 
490 0 |a Information Technology: Transmission, Processing and Storage 
028 5 0 |a 10.1007/978-1-4419-8608-5 
856 4 0 |u https://doi.org/10.1007/978-1-4419-8608-5?nosfx=y  |x Publisher  |3 Full text 
082 0 |a 621.3 
520 |a A First Course in Information Theory is an up-to-date introduction to information theory. In addition to the classical topics discussed, it provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, this book is excellent as a textbook or reference book for a senior- or graduate-level course on the subject, as well as a reference for researchers in related fields.