Relative Information: Theories and Applications

For four decades, information theory has been viewed almost exclusively as a theory based upon the Shannon measure of uncertainty and information, usually referred to as Shannon entropy. Since the publication of Shannon's seminal paper in 1948, the theory has grown extremely rapidly and has bee...
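For reference, the Shannon entropy named in the description is the standard uncertainty measure of a finite probability distribution; in the usual notation (a textbook convention, not a quotation from the book),

\[ H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log p_i, \qquad p_i \ge 0,\ \sum_{i} p_i = 1, \]

with the convention \(0 \log 0 = 0\). The chapters listed below generalize this measure (Renyi entropy, relative entropy, subjective entropy, total entropy).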

Bibliographic Details
Main Author: Jumarie, Guy
Format: eBook
Language: English
Published: Berlin, Heidelberg: Springer Berlin Heidelberg, 1990
Edition: 1st ed. 1990
Series: Springer Series in Synergetics
Collection: Springer Book Archives (-2004); collection details: see MPG.ReNa
Table of Contents:
  • 1. Relative Information — What For?
  • 1.1 Information Theory, What Is It?
  • 1.1.1 Summary of the Story
  • 1.1.2 Communication and Information
  • 1.2 Information and Natural Language
  • 1.2.1 Syntax, Semantics, Lexeme
  • 1.2.2 Information, Learning, Dislearning
  • 1.3 Prerequisites for a Theory of Information
  • 1.3.1 Relativity of Information
  • 1.3.2 Negative Information
  • 1.3.3 Entropy of Form and Pattern
  • 1.3.4 Information and Thermodynamics
  • 1.3.5 Information and Subjectivity
  • 1.4 Information and Systems
  • 1.4.1 A Model of General Systems
  • 1.4.2 A Model of Relative Information
  • 1.4.3 A Few Comments
  • 1.5 How We Shall Proceed
  • 1.5.1 Aim of the Book
  • 1.5.2 Subjective Information and Relative Information
  • 1.5.3 Minkowskian Observation of Events
  • 1.5.4 A Unified Approach to Discrete Entropy and Continuous Entropy
  • 1.5.5 A Word of Caution to the Reader
  • 2. Information Theory — The State of the Art
  • 2.1 Introduction
  • 2.2 Shannon Measure of Uncertainty
  • 2.2.1 The Probabilistic Framework
  • 2.2.2 Shannon Informational Entropy
  • 2.2.3 Entropy of Random Variables
  • 2.3 An Intuitive Approach to Entropy
  • 2.3.1 Uniform Random Experiments
  • 2.3.2 Non Uniform Random Experiments
  • 2.4 Conditional Entropy
  • 2.4.1 Framework of Random Experiments
  • 2.4.2 Application to Random Variables
  • 2.5 A Few Properties of Discrete Entropy
  • 2.6 Prior Characterization of Discrete Entropy
  • 2.6.1 Properties of Uncertainty
  • 2.6.2 Some Consequences of These Properties
  • 2.7 The Concept of Information
  • 2.7.1 Shannon Information
  • 2.7.2 Some Properties of Transinformation
  • 2.7.3 Transinformation of Random Variables
  • 2.7.4 Remarks on the Notation
  • 2.8 Conditional Transinformation
  • 2.8.1 Main Definition
  • 2.8.2 Some Properties of Conditional Transinformation
  • 2.8.3 Conditional Transinformation of Random Variables
  • 2.9 Renyi Entropy
  • 2.9.1 Definition of Renyi Entropy
  • 2.9.2 Meaning of the Renyi Entropy
  • 2.9.3 Some Properties of the Renyi Entropy
  • 2.10 Cross-Entropy or Relative Entropy
  • 2.10.1 The Main Definition
  • 2.10.2 A Few Comments
  • 2.11 Further Measures of Uncertainty
  • 2.11.1 Entropy of Degree c
  • 2.11.2 Quadratic Entropy
  • 2.11.3 R-norm Entropy
  • 2.11.4 Effective Entropy
  • 2.12 Entropies of Continuous Variables
  • 2.12.1 Continuous Shannon Entropy
  • 2.12.2 Some Properties of Continuous Entropy
  • 2.12.3 Continuous Transinformation
  • 2.12.4 Further Extensions
  • 2.13 Hatori’s Derivation of Continuous Entropy
  • 2.14 Information Without Probability
  • 2.14.1 A Functional Approach
  • 2.14.2 Relative Information
  • 2.15 Information and Possibility
  • 2.15.1 A Few Prerequisites
  • 2.15.2 A Measure of Uncertainty Without Probability
  • 2.16 Conclusions
  • 3. A Critical Review of Shannon Information Theory
  • 3.1 Introduction
  • 3.2 On the Invariance of Measures of Information
  • 3.3 On the Modelling of Negative Transinformation
  • 3.3.1 Classification of Terms
  • 3.3.2 The Problem of Modelling “True” and “False”
  • 3.3.3 Error-Detecting Codes
  • 3.4 On the Symmetry of Transinformation
  • 3.4.1 A Diverting Example
  • 3.4.2 Application of Information Theory
  • 3.4.3 On the Symmetry of Transinformation
  • 3.4.4 On a Possible Application of Renyi Entropy
  • 3.5 Entropy and the Central Limit Theorem
  • 3.5.1 The Central Limit Theorem
  • 3.5.2 An Information Theoretical Approach to the Central Limit Theorem
  • 3.5.3 Relation with Thermodynamics
  • 3.5.4 Continuous Entropy Versus Discrete Entropy
  • 3.6 On the Entropy of Continuous Variables
  • 3.6.1 The Sign of the Continuous Entropy
  • 3.6.2 A Nice Property of Continuous Entropy
  • 3.7 Arguments to Support Continuous Entropy
  • 3.7.1 On the Negativeness of Continuous Entropy
  • 3.7.2 On the Non-invariance of Continuous Entropy
  • 3.7.3 Channel Capacity in the Presence of Noise
  • 3.8 The Maximum Entropy Principle
  • 3.8.1 Statement of the Principle
  • 3.8.2 Some Examples
  • 3.9 Arguments to Support the Maximum Entropy Principle
  • 3.9.1 Information Theoretical Considerations
  • 3.9.2 Thermodynamic Considerations
  • 3.9.3 Axiomatic Derivation
  • 3.9.4 A Few Comments
  • 3.10 Information, Syntax, Semantics
  • 3.10.1 On the Absolute Nature of Information
  • 3.10.2 Information and Natural Language
  • 3.11 Information and Thermodynamics
  • 3.11.1 Informational and Thermodynamic Entropy
  • 3.11.2 Thermodynamic Entropy of Open Systems
  • 3.12 Conclusions
  • 4. A Theory of Relative Information
  • 4.1 Introduction
  • 4.2 Observation, Aggregation, Invariance
  • 4.2.1 Principle of Aggregation
  • 4.2.2 Principle of Invariance
  • 4.2.3 A Few Comments
  • 4.3 Observation with Informational Invariance
  • 4.4 Euclidean Invariance
  • 4.4.1 Orthogonal Transformation
  • 4.4.2 Application to the Observation of Probabilities
  • 4.4.3 Application to the Observation of Classes
  • 4.5 Minkowskian Invariance
  • 4.5.1 Lorentz Transformation
  • 4.5.2 Application to the Observation of Probabilities
  • 4.5.3 Application to the Observation of Classes
  • 4.6 Euclidean or Minkowskian Observation?
  • 4.6.1 Selection of the Observation Mode
  • 4.6.2 Application to the [Uncertainty, Information] Pair
  • 4.7 Information Processes and Natural Languages
  • 4.7.1 On the Absoluteness of Information
  • 4.7.2 The Information Process
  • 4.7.3 Natural Language
  • 4.7.4 Information and Natural Language
  • 4.8 Relative Informational Entropy
  • 4.8.1 Introduction to the Relative Observation
  • 4.8.2 Informational Invariance of the Observation
  • 4.8.3 Relative Entropy
  • 4.8.4 Comments and Remarks
  • 4.9 Conditional Relative Entropy
  • 4.9.1 Relative Entropy of Product of Messages
  • 4.9.2 Composition Law for Cascaded Observers
  • 4.9.3 Relative Entropy Conditional to a Given Experiment
  • 4.9.4 Applications to Determinacy
  • 4.9.5 Comparison of H(?/?) with Hr(?/?)
  • 4.10 On the Meaning and the Estimation of the Observation Parameter
  • 4.10.1 Estimation of the Observation Parameter
  • 4.10.2 Practical Meaning of the Observation Parameter
  • 4.10.3 On the Value of the Observation Parameter u(R)
  • 4.11 Relative Transinformation
  • 4.11.1 Derivation of Relative Transinformation
  • 4.11.2 Some Properties of the Relative Transinformation
  • 4.11.3 Relative Entropy and Information Balance
  • 4.11.4 Application to the Encoding Problem
  • 4.12 Minkowskian Relative Transinformation
  • 4.12.1 Definition of Minkowskian Relative Transinformation
  • 4.12.2 Some Properties of Minkowskian Relative Transinformation
  • 4.12.3 Identification via Information Balance
  • 4.13 Effect of Scaling Factor in an Observation with Informational Invariance
  • 4.14 Comparison with Renyi Entropy
  • 4.14.1 Renyi Entropy and Relative Entropy
  • 4.14.2 Transinformation of Order c and Relative Transinformation
  • 5. A Theory of Subjective Information
  • 5.1 Introduction
  • 5.2 Subjective Entropy
  • 5.2.1 Definition of Subjective Entropy
  • 5.2.2 A Few Remarks
  • 5.3 Conditional Subjective Entropy
  • 5.3.1 Definition of Conditional Subjective Entropy
  • 5.3.2 Application to Determinacy
  • 5.3.3 A Basic Inequality
  • 5.4 Subjective Transinformation
  • 5.4.1 Introduction
  • 5.4.2 Subjective Transinformation
  • 5.4.3 A Few Properties of Subjective Transinformation
  • 5.4.4 Application to Independent Random Experiments
  • 5.5 Conditional Subjective Transinformation
  • 5.5.1 Definition
  • 5.5.2 A Few Properties of Subjective Conditional Transinformation
  • 5.6 Information Balance
  • 5.6.1 Optimum Conditions for Information Balance
  • 5.6.2 Non-optimum Conditions for Information Balance
  • 5.7 Explicit Expression of Subjective Transinformation
  • 5.7.1 Discrete Probability
  • 5.7.2 Continuous Probability
  • 5.8 The General Coding Problem
  • 5.8.1 Preliminary Remarks
  • 5.8.2 The General Coding Problem
  • 5.8.3 On the Problem of Error Correcting Codes Revisited
  • 5.9 Capacity of a Channel
  • 5.9.1 The General Model
  • 5.9.2 Channel with Noise
  • 5.9.3 Channel with Noise and Filtering
  • 5.10 Transinformation in the Presence of Fuzziness
  • 5.10.1 On the Entropy of a Fuzzy Set
  • 5.10.2 Application of Subjective Transinformation
  • 5.10.3 The Brillouin Problem
  • 5.11 On the Use of Renyi Entropy
  • 5.11.1 Renyi Entropy and Subjective Entropy
  • 5.11.2 Transinformation of Order c and Shannon Transinformation
  • 6. A Unified Approach to Discrete and Continuous Entropy
  • 6.1 Introduction
  • 6.2 Intuitive Derivation of “Total Entropy”
  • 6.2.1 Preliminary Definitions and Notation
  • 6.2.2 Physical Derivation of He(X)
  • 6.3 Mathematical Derivation of Total Entropy
  • 6.3.1 The Main Axioms
  • 6.3.2 Derivation of Total Entropy
  • 6.3.3 On the Expression of the Total Entropy
  • 6.4 Alternative Set of Axioms for the Total Entropy
  • 6.4.1 Generalization of Shannon Recurrence Equation
  • 6.4.2 A Model via Uniform Interval of Definition
  • 6.5 Total Entropy with Respect to a Measure
  • 6.6 Total Renyi Entropy
  • 6.6.1 Preliminary Remarks About Renyi Entropy
  • 6.6.2 Axioms for Total Renyi Entropy
  • 6.6.3 Total Renyi Entropy
  • 6.6.4 Total Renyi Entropy with Respect to a Measure
  • 6.7 On the Practical Meaning of Total Entropy
  • 6.7.1 General Remarks
  • 6.7.2 Total Entropy and Relative Entropy
  • 6.7.3 Total Entropy and Subjective Entropy
  • 6.8 Further Results on the Total Entropy
  • 6.8.1 Some Mathematical Properties of the Total Entropy
  • 6.8.2 On the Entropy of Pattern
  • 6.9 Transinformation and Total Entropy
  • 6.9.1 Total Entropy of a Random Vector
  • 6.9.2 Conditional Total Entropy
  • 6.9.3 On the Definition of Transinformation
  • 6.9.4 Total Kullback Entropy
  • 6.10 Relation Between Total Entropy and Continuous Entropy
  • 6.10.1 Total Shannon Entropy and Continuous Entropy
  • 6.10.2 Total Renyi Entropy and Continuous Renyi Entropy
  • 6.10.3 Application to an Extension Principle
  • 6.11 Total Entropy and Mixed Theory of Information
  • 6.11.1 Background to Effective Entropy and Inset Entropy
  • 6.11.2 Inset Entropy is an Effective Entropy
  • 7. A Unified Approach to Informational