Dover Publications, 1957. — 128 p.
The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists, and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.
In his first paper, Dr. Khinchin develops the concept of entropy in probability theory as a measure of uncertainty of a finite “scheme,” and discusses a simple application to coding theory. The second paper investigates the restrictions previously placed on the study of sources, channels, and codes and attempts “to give a complete, detailed proof of both … Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory.”
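For orientation, the central quantity of the first paper can be stated briefly: for a finite scheme whose events A_1, ..., A_n occur with probabilities p_1, ..., p_n, Khinchin defines the entropy

    H(p_1, ..., p_n) = -\sum_{k=1}^{n} p_k \log p_k

(with 0 log 0 taken to be 0). The Uniqueness Theorem listed below shows that, up to a positive constant factor, this is essentially the only continuous measure of uncertainty that is largest for the uniform scheme and additive over compound schemes.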
The Entropy Concept in Probability Theory
Entropy of Finite Schemes
The Uniqueness Theorem
Entropy of Markov chains
Fundamental Theorems
Application to Coding Theory
On the Fundamental Theorems of Information Theory
Elementary Inequalities
Two generalizations of Shannon's inequality
Three inequalities of Feinstein
Ergodic Sources
Concept of a source. Stationarity. Entropy
Ergodic Sources
The E property. McMillan's theorem
The martingale concept. Doob's theorem
Auxiliary propositions
Proof of McMillan's theorem
Channels and the Sources Driving Them
Concept of channel. Noise. Stationarity. Anticipation and memory
Connection of the channel to the source
The ergodic case
Feinstein's Fundamental Lemma
Formulation of the problem
Proof of the lemma
Shannon's Theorems
Coding
The first Shannon theorem
The second Shannon theorem