Information Theory and Signal Processing

Claude Shannon, a genius, devised a rigorous mathematical method to quantify the expected information content of a random outcome, a quantity he termed “entropy.”

In this method, Shannon used the base-2 logarithm to express the number of bits needed to represent an outcome: an outcome with probability p carries -log2(p) bits of information. Weighting this quantity by each outcome's probability and summing over all outcomes gives the expected information content, H = -Σ p_i * log2(p_i).
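Here is a minimal Python sketch of that calculation (the function name and example distributions are just for illustration):

```python
import math

def shannon_entropy(probabilities):
    """Expected information content in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

The convention of skipping zero-probability outcomes reflects the usual limit p * log2(p) → 0 as p → 0.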

Huffman codes encapsulate the essence of Shannon’s entropy concept. Huffman in fact published his coding algorithm in 1952, a few years after Shannon’s 1948 theory; it assigns shorter codewords to more probable symbols, so the average code length approaches the entropy of the source.
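As a rough sketch of that connection (purely illustrative, using a priority queue to build the tree and reporting only the resulting code lengths), one can compare the average Huffman code length against the entropy of a short text:

```python
import heapq
from collections import Counter

def huffman_code_lengths(text):
    """Build a Huffman tree over symbol frequencies and return each symbol's code length."""
    freq = Counter(text)
    # Heap entries: (weight, tie_breaker, {symbol: depth_so_far})
    heap = [(w, i, {sym: 0}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: a single distinct symbol still needs 1 bit
        return {sym: 1 for sym in heap[0][2]}
    counter = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)  # merge the two least frequent subtrees
        w2, _, b = heapq.heappop(heap)
        merged = {sym: depth + 1 for sym, depth in {**a, **b}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "abracadabra"
freq = Counter(text)
lengths = huffman_code_lengths(text)
avg_bits = sum(freq[s] * lengths[s] for s in freq) / len(text)
print(lengths, avg_bits)  # average code length sits close to the entropy of the text
```

Running this on a longer, more skewed text shows the average code length tracking the entropy bound even more closely.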

More advanced topics in Information Theory
