Hi,
That's not always true. Huffman coding does not always achieve the entropy limit because it assigns integer code lengths: each codeword must be a whole number of bits, while entropy represents the ideal average, which can involve fractional bits per symbol. Huffman achieves the entropy exactly only when every symbol probability is a negative power of two (a dyadic distribution). Otherwise the average code length strictly exceeds the entropy, though it always stays within one bit of it: H ≤ L < H + 1.
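
To make this concrete, here is a small Python sketch (the two distributions are just illustrative examples, not from any particular source) that builds a Huffman code and compares its average length to the entropy:

```python
import heapq
import math

def huffman_lengths(probs):
    """Return the Huffman code length (in bits) assigned to each symbol."""
    # Each heap entry: (subtree probability, tie-breaker, symbol indices in subtree).
    # The unique integer tie-breaker keeps heapq from ever comparing the lists.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:               # every symbol in the merge gains one bit of depth
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

dyadic = [0.5, 0.25, 0.25]     # all powers of two: Huffman matches entropy exactly
non_dyadic = [0.4, 0.3, 0.3]   # not dyadic: average length exceeds entropy

for probs in (dyadic, non_dyadic):
    H = -sum(p * math.log2(p) for p in probs)
    L = sum(p * length for p, length in zip(probs, huffman_lengths(probs)))
    print(f"entropy = {H:.4f} bits, Huffman average = {L:.4f} bits")
```

Running this, the dyadic distribution gives 1.5 bits for both entropy and average length, while the non-dyadic one gives an entropy of about 1.571 bits against a Huffman average of 1.6 bits, which illustrates the gap.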