Classical information

Classical information was first defined rigorously by Claude Shannon. The amount of information in a message is measured by how much communication is needed to convey it. Roughly speaking, if one has a list of possible messages one might want to convey, then the information content of the messages is how much communication is required to tell someone which message from the list one wishes to communicate.

We denote the messages by a random variable X. This is a list of messages {x_1, x_2, ..., x_m}, each x_i occurring with probability p(x_i). We denote this probability distribution by P_X.
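
As a concrete illustration, the sketch below represents such a distribution P_X in Python. The four messages, their probabilities, and the name p_X are invented purely for this example and do not come from the text above.

```python
# A toy probability distribution P_X over four messages.
# Both the messages and the probabilities are made up for illustration.
p_X = {
    "x1": 0.5,
    "x2": 0.25,
    "x3": 0.125,
    "x4": 0.125,
}

# A valid distribution has non-negative probabilities that sum to 1.
assert all(p >= 0 for p in p_X.values())
assert abs(sum(p_X.values()) - 1.0) < 1e-12
```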

Shannon showed that the number of bits needed to convey which message occurs is given by the Shannon entropy H(X) = −∑_x p(x) log p(x), where the logarithm is taken in base 2. Essentially, he showed that if one draws a sequence of n messages independently according to P_X, then one can compress this information into a space of dimension just over 2^{nH(X)}, that is, into just over nH(X) bits, such that with high probability the information is sent faithfully.
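
To make the formula concrete, the sketch below computes H(X) for the toy distribution used above and compares the roughly nH(X) bits needed to compress a sequence of n messages with the n log_2(m) bits of a naive fixed-length encoding. The distribution, the value of n, and all names are assumptions made for this example.

```python
import math

# Toy distribution P_X over four messages (invented for illustration).
p_X = {"x1": 0.5, "x2": 0.25, "x3": 0.125, "x4": 0.125}

def shannon_entropy(dist):
    """H(X) = -sum_x p(x) log2 p(x), in bits; terms with p(x) = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

H = shannon_entropy(p_X)
print(f"H(X) = {H:.3f} bits per message")   # 1.750 bits for this distribution

# Compressing a sequence of n messages drawn independently from P_X needs a
# space of dimension roughly 2^(n*H(X)), i.e. about n*H(X) bits, compared
# with n*log2(m) bits for a naive fixed-length code over an m-letter alphabet.
n, m = 100, len(p_X)
print(f"compressed: ~{n * H:.0f} bits   naive: {n * math.log2(m):.0f} bits")
```

For this example H(X) = 1.75 bits, so a long sequence can be stored using about 1.75 bits per message rather than the 2 bits a fixed-length code over four messages would require.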
