Basics of Huffman Coding

huffman-basics-cover

Battletoads Huffman Example Images: Provided by Brad Smith. YouTube Huffman Compression Video: Provided on Travelschlepp’s Channel.

I originally started this article almost a month ago, but I am finally getting around to finishing it. Today I am going to tackle the topic of Huffman coding. Huffman coding is a lossless compression scheme: an encoder shrinks the data, and a decoder restores it exactly. I first started tinkering with Huffman coding when I ran across this post on the NesDev forum by Brad Smith, aka “RainWarrior”. In the post, Brad investigates the Huffman code he noticed while disassembling the NES classic Battletoads.

How does Huffman coding work?

I am going to try to avoid the extremely technical terms. I apologize if this is a “watered down” synopsis, but if something is hard to understand, I like to make it simple!

Part 1: Compressing the Data. Count how often each symbol appears, build a binary tree from those counts, and replace every symbol with the bit path that leads to its leaf in the tree.

Part 2: Decompressing the Data. Read the compressed bits one at a time, walking the tree from the root until you reach a leaf, and write out that leaf's symbol.

What happened to keeping it simple?

Do not worry! When I first started studying Huffman coding, it took me a whole weekend to understand the process. So, I have created some visual examples to drive home the concept.

huffman-map

Above is a simple image of some random tiles arranged into a map. For the sake of example, let us assume that each tile takes 1 byte to represent. Since the map consists of a total of one hundred tiles, the map takes 100 bytes to represent (100 tiles x 1 byte per tile). So our goal for this Huffman compression is to reduce the size of the map to less than 100 bytes!

To begin, we must analyze our basic symbols. In this case, our symbols are the tiles themselves. We have a total of 5 symbols: wood, stone, grass, dirt, and water. Then, we must calculate their frequencies in the map. This has been completed in the image below.

huffman-freq
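
If you would rather see this first step as code, here is a minimal sketch in Python. The tiny example map below is made up purely for illustration (the article's map has 100 tiles); the point is simply that tallying symbol frequencies is a one-liner.

```python
from collections import Counter

# A map is just a sequence of tile symbols (1 byte each in the original data).
# This tiny map is made up for illustration; the article's map has 100 tiles.
tile_map = [
    "grass", "grass", "water", "wood",
    "dirt",  "grass", "stone", "grass",
]

# Step 1: count how often each symbol appears.
frequencies = Counter(tile_map)
print(frequencies)
# e.g. Counter({'grass': 4, 'water': 1, 'wood': 1, 'dirt': 1, 'stone': 1})
```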

Now, we must construct our Huffman tree and encode each tile. For this step, here is a wonderful YouTube video that describes the process using letters instead of map tiles. Before I found this video, I was in the dark… so hopefully it will be equally helpful for you. Continuing with this example, I have included the steps to create a single node in the Huffman tree. You repeat the process until all symbols have been added to the tree.

huffman-creating-nodes

huffman-tree
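
For anyone who prefers code to diagrams, here is a rough sketch of the tree-building step in Python, continuing from the frequency count above. It repeatedly merges the two lowest-frequency nodes, just like the images show; the nested-tuple node representation is my own simplification and has nothing to do with the actual Battletoads implementation.

```python
import heapq
import itertools

def build_huffman_tree(frequencies):
    """Merge the two least-frequent nodes until a single tree remains.

    frequencies: dict mapping symbol -> count (e.g. the Counter from above).
    Returns the root: leaves are plain symbols, internal nodes are
    (left_child, right_child) tuples.
    """
    tie_breaker = itertools.count()  # so the heap never has to compare nodes
    heap = [(count, next(tie_breaker), symbol)
            for symbol, count in frequencies.items()]
    heapq.heapify(heap)

    while len(heap) > 1:
        freq1, _, left = heapq.heappop(heap)   # lowest frequency
        freq2, _, right = heapq.heappop(heap)  # second lowest
        # The merged node's frequency is the sum of its two children.
        heapq.heappush(heap, (freq1 + freq2, next(tie_breaker), (left, right)))

    return heap[0][2]

tree = build_huffman_tree(frequencies)
```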

At this point, the bulk of the algorithm is over. Our last step is to encode a symbol (tile) by traversing the Huffman tree. To encode the entire map, we would need to encode each tile; however, since the process is the same for every tile, I have only shown how to encode one wood tile. Decompression works the other way around: instead of writing out a tile's bit path, you read the compressed bits and follow them down the tree until you land on a tile.

huffman-traversal
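
Here is the same last step as a hedged Python sketch: one helper walks the tree to assign a bit string to every tile, and another follows bits back down the tree to decode. The function names are mine, and the tree is assumed to come from the build_huffman_tree sketch above.

```python
def make_codes(node, prefix=""):
    """Walk the tree; going left appends '0', going right appends '1'.
    Returns a dict mapping each symbol to its bit string."""
    if not isinstance(node, tuple):          # leaf: a single symbol
        return {node: prefix or "0"}
    left, right = node
    codes = make_codes(left, prefix + "0")
    codes.update(make_codes(right, prefix + "1"))
    return codes

def encode(tiles, codes):
    """Replace every tile with its variable-length bit string."""
    return "".join(codes[tile] for tile in tiles)

def decode(bits, root):
    """Read bits one at a time, walking the tree; emit a tile at every leaf."""
    tiles, node = [], root
    for bit in bits:
        node = node[0] if bit == "0" else node[1]
        if not isinstance(node, tuple):      # reached a leaf
            tiles.append(node)
            node = root                      # start over for the next tile
    return tiles

# Tying it together:
codes = make_codes(tree)
bits = encode(tile_map, codes)
assert decode(bits, tree) == tile_map
```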

So, in the above example we have saved 5 bits for the single wood tile. By doing some quick math, we can see that the wood tiles alone will save a total of 45 bits (9 wood tiles x 5 bits saved per wood tile) across the entire map. If we run the algorithm for the full map, we will save a total of 574 bits or 71.75 bytes (the new map will be only 28.25 bytes). Thus, we have shrunk the map by about 71.75%, not too shabby! Or, to sound more impressive, you could say the original map is more than 3 times the size of our compressed map!
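
If you want to double-check those numbers yourself, the arithmetic is quick; this snippet just re-derives the figures quoted above from the 574-bit total savings.

```python
original_bits = 100 * 8                  # 100 tiles at 8 bits (1 byte) each
saved_bits = 574                         # total savings for the full map
compressed_bits = original_bits - saved_bits

print(compressed_bits / 8)               # 28.25 bytes for the compressed map
print(saved_bits / original_bits)        # 0.7175 -> about 71.75% smaller
print(original_bits / compressed_bits)   # ~3.5x: the original is over 3x bigger
```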

You will find that the effectiveness of Huffman coding (like all compression algorithms) depends on your data set. There are many articles that describe the optimal use cases for such an algorithm, but I will leave that topic for another day. In closing, I will leave you with the following image provided by Brad Smith… the Huffman tree used by Battletoads. Awesome!

huffman-battletoads