Table of Contents:
  • 1 Information Sources
  • 2 Entropy and Information
  • 3 The Entropy Ergodic Theorem
  • 4 Information Rates I
  • 5 Relative Entropy
  • 6 Information Rates II
  • 7 Relative Entropy Rates
  • 8 Ergodic Theorems for Densities
  • 9 Channels and Codes
  • 10 Distortion
  • 11 Source Coding Theorems
  • 12 Coding for Noisy Channels