Coding Theorems of Information Theory

The imminent exhaustion of the first printing of this monograph and the kind willingness of the publishers have presented me with the opportunity to correct a few minor misprints and to make a number of additions to the first edition. Some of these additions are in the form of remarks scattered thro...


Bibliographic Details
Main Author: Wolfowitz, Jacob
Format: eBook
Language: English
Published: Berlin, Heidelberg: Springer Berlin Heidelberg, 1964
Edition: 2nd ed. 1964
Series:Ergebnisse der Mathematik und ihrer Grenzgebiete. 2. Folge, A Series of Modern Surveys in Mathematics
Collection: Springer Book Archives (–2004)
Table of Contents:
  • 1. Heuristic Introduction to the Discrete Memoryless Channel
  • 2. Combinatorial Preliminaries
  • 2.1. Generated sequences
  • 2.2. Properties of the entropy function
  • 3. The Discrete Memoryless Channel
  • 3.1. Description of the channel
  • 3.2. A coding theorem
  • 3.3. The strong converse
  • 3.4. Strong converse for the binary symmetric channel
  • 3.5. The finite-state channel with state calculable by both sender and receiver
  • 3.6. The finite-state channel with state calculable only by the sender
  • 4. Compound Channels
  • 4.1. Introduction
  • 4.2. The canonical channel
  • 4.3. A coding theorem
  • 4.4. Strong converse
  • 4.5. Compound d.m.c. with c.p.f. known only to the receiver or only to the sender
  • 4.6. Channels where the c.p.f. for each letter is stochastically determined
  • 4.7. Proof of Theorem 4.6.4
  • 4.8. The d.m.c. with feedback
  • 4.9. Strong converse for the d.m.c. with feedback
  • 5. The Discrete Finite-Memory Channel
  • 5.1. The discrete channel
  • 5.2. The discrete finite-memory channel
  • 5.3. The coding theorem for the d.f.m.c.
  • 5.4. Strong converse of the coding theorem for the d.f.m.c.
  • 5.5. Rapidity of approach to C in the d.f.m.c.
  • 5.6. Discussion of the d.f.m.c.
  • 6. Discrete Channels with a Past History
  • 6.1. Preliminary discussion
  • 6.2. Channels with a past history
  • 6.3. Applicability of the coding theorems of Section 7.2 to channels with a past history
  • 6.4. A channel with infinite duration of memory of previously transmitted letters
  • 6.5. A channel with infinite duration of memory of previously received letters
  • 6.6. Indecomposable channels
  • 6.7. The power of the memory
  • 7. General Discrete Channels
  • 7.1. Alternative description of the general discrete channel
  • 7.2. The method of maximal codes
  • 7.3. The method of random codes
  • 7.4. Weak converses
  • 7.5. Digression on the d.m.c.
  • 7.6. Discussion of the foregoing
  • 7.7. Channels without a capacity
  • 8. The Semi-Continuous Memoryless Channel
  • 8.1. Introduction
  • 8.2. Strong converse of the coding theorem for the s.c.m.c.
  • 8.3. Proof of Lemma 8.2.1
  • 8.4. The strong converse with (math) in the exponent
  • 9. Continuous Channels with Additive Gaussian Noise
  • 9.1. A continuous memoryless channel with additive Gaussian noise
  • 9.2. Message sequences within a suitable sphere
  • 9.3. Message sequences on the periphery of the sphere or within a shell adjacent to the boundary
  • 9.4. Another proof of Theorems 9.2.1 and 9.2.2
  • 10. Mathematical Miscellanea
  • 10.1. Introduction
  • 10.2. The asymptotic equipartition property
  • 10.3. Admissibility of an ergodic input for a discrete finite-memory channel
  • 11. Group Codes. Sequential Decoding
  • 11.1. Group codes
  • 11.2. Canonical form of the matrix M
  • 11.3. Sliding parity check codes
  • 11.4. Sequential decoding
  • References
  • List of Channels Studied or Mentioned