Information Theory — Quantifying Information

Information theory, founded by Claude Shannon in 1948, provides the mathematical framework for quantifying information and for establishing the fundamental limits of data compression and reliable communication.

Key Quantities and Results

  • Entropy H(X) = −Σ p(x) log₂ p(x) — the average information content of a random variable X, measured in bits when the logarithm is base 2.
  • Mutual information I(X;Y) = H(X) + H(Y) − H(X,Y) — how much observing Y reduces the uncertainty about X (see the first sketch after this list).
  • Channel capacity — the maximum rate at which information can cross a noisy channel with arbitrarily small error probability, guaranteed achievable by Shannon's noisy-channel coding theorem (second sketch below).
  • Source coding — lossless compression, which can approach but not beat the entropy limit of H(X) bits per symbol; Huffman and arithmetic coding are the classic schemes (third sketch below).
  • Error-correcting codes — structured redundancy that lets a receiver detect and correct transmission errors; Hamming, Reed–Solomon, and LDPC codes are standard examples (final sketch below).
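
As a concrete illustration of the first two quantities, here is a minimal Python sketch that computes the entropy of a discrete distribution and the mutual information of a joint distribution via the identity I(X;Y) = H(X) + H(Y) − H(X,Y). The function names and example distributions are assumptions made for this sketch, not any standard API.

    import math

    def entropy(p):
        """Shannon entropy H(X) in bits of a discrete distribution p."""
        return -sum(px * math.log2(px) for px in p if px > 0)

    def mutual_information(joint):
        """I(X;Y) in bits from a joint distribution given as a 2D list,
        using I(X;Y) = H(X) + H(Y) - H(X,Y)."""
        px = [sum(row) for row in joint]                  # marginal of X
        py = [sum(col) for col in zip(*joint)]            # marginal of Y
        hxy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)
        return entropy(px) + entropy(py) - hxy

    # A fair coin carries one full bit; a biased coin carries less.
    print(entropy([0.5, 0.5]))   # 1.0
    print(entropy([0.9, 0.1]))   # ~0.469

    # Perfectly correlated binary X and Y share exactly one bit.
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0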
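
For channel capacity, the standard worked example is the binary symmetric channel, which flips each transmitted bit independently with probability p; its capacity is C = 1 − H(p) bits per channel use, where H(p) is the binary entropy function. A sketch under that assumption:

    import math

    def bsc_capacity(p):
        """Capacity in bits per use of a binary symmetric channel
        with crossover probability p, via C = 1 - H(p)."""
        if p in (0.0, 1.0):
            return 1.0  # deterministic channel: no uncertainty at all
        h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
        return 1.0 - h

    print(bsc_capacity(0.0))   # 1.0, a noiseless channel
    print(bsc_capacity(0.11))  # ~0.5, half a bit survives per use
    print(bsc_capacity(0.5))   # 0.0, pure noise carries nothing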
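
For source coding, the sketch below builds a Huffman code from symbol frequencies using a min-heap; the helper name and the toy distribution are assumptions of this example. Because the example probabilities are all powers of 1/2, the average code length (1.75 bits per symbol) exactly matches the source entropy.

    import heapq

    def huffman_code(freqs):
        """Build a prefix code (symbol -> bitstring) from frequencies."""
        # Heap entries: (weight, tiebreaker, {symbol: code_so_far}).
        heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)   # the two lightest subtrees...
            w2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + code for s, code in c1.items()}
            merged.update({s: "1" + code for s, code in c2.items()})
            heapq.heappush(heap, (w1 + w2, counter, merged))  # ...get merged
            counter += 1
        return heap[0][2]

    code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
    print(code)  # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}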
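
For error correction, here is a sketch of the classic Hamming(7,4) code, which protects 4 data bits with 3 parity bits and corrects any single flipped bit; the bit layout follows the standard convention of parity bits at codeword positions 1, 2, and 4, and the function names are invented for this example.

    def hamming74_encode(data):
        """Encode 4 data bits [d1, d2, d3, d4] as a 7-bit codeword."""
        d1, d2, d3, d4 = data
        p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1,3,5,7
        p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
        p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_decode(word):
        """Correct up to one flipped bit and return the 4 data bits."""
        c = list(word)
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3   # 1-based error position, 0 if clean
        if syndrome:
            c[syndrome - 1] ^= 1
        return [c[2], c[4], c[5], c[6]]

    word = hamming74_encode([1, 0, 1, 1])
    word[4] ^= 1                       # flip one bit in transit
    print(hamming74_decode(word))      # [1, 0, 1, 1], fully recovered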