Information Theory is studied from the following viewpoints: (1) the theory of entropy as a measure of the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, and the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various ways. Mixing and AMS channels are also considered in detail, with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.
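The Shannon entropy mentioned above admits a very short computational illustration. The sketch below (the function name and examples are ours, not the book's) computes H(p) = -Σ pᵢ log₂ pᵢ for a finite discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)) of a discrete distribution,
    measured in bits; terms with p_i = 0 contribute nothing by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))        # → 1.0
# A uniform distribution over 4 outcomes carries two bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

The Kolmogorov-Sinai entropy treated in the book generalizes this quantity to a measure-preserving dynamical system by taking a supremum of per-symbol Shannon entropies over finite partitions.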
Abstract Methods in Information Theory (Multivariate Analysis), published 1999 by World Scientific Publishing Company.