Limitations of Huffman coding

In the Huffman coding method, the coding rate is the average number of bits used to represent a symbol from a source, and, for a given probability model, the entropy is the lowest rate at which the source can be coded. Huffman coding is known to come within one bit per symbol of this limit, and that bound can be tightened somewhat.
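With R denoting the coding rate and H(S) the entropy of the source S, the standard bound reads

    H(S) ≤ R < H(S) + 1

and it is the upper bound that the following result tightens.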

The Huffman algorithm will generate a code whose rate is within p_max + 0.086 of the entropy, where p_max is the probability of the most frequently occurring symbol.
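Written as an inequality, the tightened bound is

    H(S) ≤ R < H(S) + p_max + 0.086

where the constant 0.086 does not depend on the source.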

In applications where the alphabet size is large, p_max is generally quite small, and the deviation from the entropy, especially as a percentage of the rate, is quite small.

However, in cases where the alphabet is small and the probabilities of occurrence of the different letters are skewed, the value of p_max can be quite large and the Huffman code can become rather inefficient when compared to the entropy.
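As a concrete illustration, here is a minimal Python sketch that builds Huffman code lengths for a skewed source and compares the resulting rate to the entropy. The three-letter probabilities are illustrative, not taken from any particular source.

    import heapq
    from math import log2

    def huffman_code_lengths(probs):
        # Heap entries are (probability, tie-breaker, symbols in subtree);
        # the integer tie-breaker keeps Python from comparing the lists.
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, syms1 = heapq.heappop(heap)
            p2, _, syms2 = heapq.heappop(heap)
            # Merging two subtrees adds one bit to every symbol under them.
            for s in syms1 + syms2:
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, tie, syms1 + syms2))
            tie += 1
        return lengths

    # Illustrative skewed three-letter source.
    probs = [0.95, 0.02, 0.03]
    lengths = huffman_code_lengths(probs)

    entropy = -sum(p * log2(p) for p in probs)          # lowest possible rate
    rate = sum(p * l for p, l in zip(probs, lengths))   # average code length

    print(f"entropy = {entropy:.3f} bits/symbol")  # about 0.335
    print(f"rate    = {rate:.3f} bits/symbol")     # 1.050, roughly 3x the entropy

Here the most likely letter must still receive a whole codeword of at least one bit, so the rate is stuck at 1.050 bits/symbol even though the entropy is only about 0.335 bits/symbol.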

One way to avoid this problem is to block more than one symbol together and generate an extended Huffman code over the blocks.
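Continuing the sketch above (it reuses probs and huffman_code_lengths from that example, and assumes the letters are emitted independently), blocking symbols into pairs gives a nine-letter extended alphabet whose Huffman code closes much of the gap:

    from itertools import product

    # The extended alphabet is every two-letter sequence,
    # with product probabilities under the independence assumption.
    pair_probs = [p * q for p, q in product(probs, repeat=2)]
    pair_lengths = huffman_code_lengths(pair_probs)

    # Average bits per pair, halved to get bits per original symbol.
    pair_rate = sum(p * l for p, l in zip(pair_probs, pair_lengths)) / 2
    print(f"extended rate = {pair_rate:.3f} bits/symbol")  # about 0.611

The per-symbol rate drops from 1.050 to about 0.611 bits. The price is alphabet growth: blocking n symbols from an m-letter alphabet produces m^n extended symbols, so the codebook size grows quickly with the block length.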


Application of Huffman coding
