
I'm implementing JPEG compression for a microcontroller. In the Huffman coding step I decided to use the standard fixed Huffman tables. I've read somewhere that these tables are suitable for general images, but I cannot find a statistical figure for what percentage they reduce the data size by. If the step just shrinks the runs of zeros, then I can do it with a few lines of code.
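For reference, here is a minimal sketch of what the fixed-table step looks like, assuming a `put_bits` bit writer and a tiny excerpt of the sample luminance AC table from Annex K of the JPEG spec (the names and the demo in `main` are illustrative; verify the codewords against your copy of the standard):

```c
#include <stdint.h>
#include <stdio.h>

/* Tiny excerpt of the sample luminance AC table (Annex K, Table K.5);
 * the full table has 162 (run,size) symbols. Verify against the spec. */
static const struct { uint16_t code; uint8_t len; } ac[256] = {
    [0x00] = {0xA, 4},   /* EOB           : 1010 */
    [0x01] = {0x0, 2},   /* run 0, size 1 : 00   */
    [0x02] = {0x1, 2},   /* run 0, size 2 : 01   */
    [0x11] = {0xC, 4},   /* run 1, size 1 : 1100 */
};

/* Stub bit writer: prints bits instead of packing them into bytes. */
static void put_bits(uint32_t bits, int nbits)
{
    for (int i = nbits - 1; i >= 0; i--)
        putchar((bits >> i) & 1 ? '1' : '0');
}

/* JPEG "size" category: number of bits needed to represent |v|. */
static int bit_size(int v)
{
    int s = 0;
    if (v < 0) v = -v;
    while (v) { s++; v >>= 1; }
    return s;
}

/* Emit one (zero-run, coefficient) event: one fixed-table lookup for
 * the run/size symbol, then the raw amplitude bits. Negative values
 * are sent as value + 2^size - 1, per the spec. Runs longer than 15
 * need ZRL symbols; omitted here. */
static void encode_ac(int run, int coeff)
{
    int size = bit_size(coeff);
    int sym  = (run << 4) | size;
    put_bits(ac[sym].code, ac[sym].len);
    if (coeff < 0) coeff += (1 << size) - 1;
    put_bits((uint32_t)coeff & ((1u << size) - 1), size);
}

int main(void)
{
    encode_ac(0, 3);                        /* coefficient 3, no preceding zeros */
    encode_ac(1, -1);                       /* coefficient -1 after one zero     */
    put_bits(ac[0x00].code, ac[0x00].len);  /* end-of-block                      */
    putchar('\n');
    return 0;
}
```

So it is more than counting zeros, but it is still only a per-symbol table lookup plus the amplitude bits.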

qand
  • But what is the fixed Huffman table? I cannot find it in the link you posted below. Does it work with general images? – nhoxbypass May 21 '19 at 07:34
  • It depends on how much your image statistically fits the average image that was used to create the default Huffman table. Most of the compression in JPEG comes from the DCT and quantization, so unless you have a pathological case (a very noisy image) you will probably only gain a few percent of compression from standard tables. You will have to test with representative samples of your own data to find out. Asking "how much exactly" doesn't make sense without knowing the exact input data... – jrudolph Mar 15 '21 at 09:38

1 Answer


Standard Huffman tables simply reduce compression time. If you do not use a pre-built table, your encoder has to scan the DCT data twice. The first pass generates symbol counts to build the Huffman table. The second pass encodes the data using that table. A two-pass scan generates the optimal Huffman table and reduces the data size more than a pre-canned one in most cases (by how much depends on the image).
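For concreteness, a sketch of what the extra pass buys you: pass 1 tallies how often each (run,size) symbol occurs, and code lengths tailored to that histogram are derived before pass 2 encodes. The simple O(n²) merge below is an assumption for illustration, not how production encoders do it, and a real JPEG encoder must additionally cap code lengths at 16 bits, as the spec requires:

```c
#include <stdint.h>

#define NSYM 256  /* one slot per possible (run,size) symbol */

/* Derive Huffman code lengths from the frequencies gathered in pass 1,
 * using a plain two-lightest-nodes merge (fine for 256 symbols).
 * Caveat: a real JPEG encoder must also limit lengths to 16 bits. */
static void huff_lengths(const uint32_t freq[NSYM], uint8_t len[NSYM])
{
    uint64_t weight[2 * NSYM];
    int parent[2 * NSYM];
    int leaf[NSYM];
    int n = 0;

    for (int s = 0; s < NSYM; s++) {
        len[s] = 0;
        leaf[s] = -1;
        if (freq[s]) {
            weight[n] = freq[s];
            parent[n] = -1;
            leaf[s] = n++;
        }
    }
    if (n == 0) return;
    if (n == 1) {                        /* degenerate: one symbol, 1-bit code */
        for (int s = 0; s < NSYM; s++)
            if (leaf[s] >= 0) len[s] = 1;
        return;
    }

    int total = n, alive = n;
    while (alive > 1) {
        int a = -1, b = -1;              /* find the two lightest live nodes */
        for (int i = 0; i < total; i++) {
            if (parent[i] != -1) continue;
            if (a < 0 || weight[i] < weight[a]) { b = a; a = i; }
            else if (b < 0 || weight[i] < weight[b]) { b = i; }
        }
        weight[total] = weight[a] + weight[b];  /* merge into a new node */
        parent[total] = -1;
        parent[a] = parent[b] = total;
        total++;
        alive--;
    }

    for (int s = 0; s < NSYM; s++)       /* leaf depth = code length */
        for (int i = leaf[s]; i >= 0 && parent[i] != -1; i = parent[i])
            len[s]++;
}
```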

PNG has standard Huffman tables. If you use those, you do not have to encode the table in the compressed stream (unlike JPEG), thus saving space there.
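For comparison, DEFLATE's fixed code (what PNG uses) is fully determined by the code lengths below, per RFC 1951 §3.2.6, which is why a decoder can rebuild it without the table ever being stored:

```c
#include <stdint.h>

/* Code lengths of DEFLATE's fixed literal/length and distance codes
 * (RFC 1951, section 3.2.6). The canonical codewords follow from
 * these lengths alone, so nothing is written to the stream. */
static void deflate_fixed_lengths(uint8_t litlen[288], uint8_t dist[30])
{
    int i;
    for (i = 0; i <= 143; i++) litlen[i] = 8;  /* literals 0..143    */
    for (;      i <= 255; i++) litlen[i] = 9;  /* literals 144..255  */
    for (;      i <= 279; i++) litlen[i] = 7;  /* EOB and lengths    */
    for (;      i <= 287; i++) litlen[i] = 8;  /* lengths 280..287   */
    for (i = 0; i < 30;   i++) dist[i]   = 5;  /* all distance codes */
}
```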

user3344003
  • You didn't answer my question. As I read here (http://www.impulseadventure.com/photo/optimized-jpeg.html), the JPEG standard has standard tables too, not just PNG. If I use those standard tables, the process becomes replacing a byte with a bit string (not necessarily less than 8 bits). My question is: does that process really compress data (excluding the effect of zero run-length encoding)? – qand Oct 26 '15 at 07:53
  • JPEG does not use standard Huffman tables. The standard includes sample Huffman tables as an appendix. You have to use a Huffman table of some sort encoded in the JPEG stream, be it a sample from the standard or one of your own devising. You do not have to use the sample Huffman tables in the standard. And yes, it generally compresses data. PNG has the option of using standard Huffman tables or ones you generate yourself. A PNG decoder must be aware of the standard table. A JPEG decoder does not need to have any knowledge of the tables in the JPEG standard. – user3344003 Oct 26 '15 at 23:58
  • "And yes, it generally compresses data" -> I doubt using a huffman table without analyzing data will compress anything. Do you have any proof, or a source ? – qand Oct 27 '15 at 09:16
  • If you generate your own tables, it clearly compresses non-random data. According to your source, the precanned tables in the JPEG standard are only slightly worse than ideal tables. – user3344003 Oct 27 '15 at 21:02
  • It doesn't answer my question: how much does it compress if I use the standard tables, _excluding the effect of zero run-length encoding_? – qand Oct 28 '15 at 03:33
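To put a rough, hedged number on the thread above, assuming the Annex K sample luminance AC table: the common event "coefficient ±1 after no zeros" is symbol (0,1), which costs a 2-bit code plus 1 amplitude bit, i.e. 3 bits instead of a raw 8-bit byte. Rare symbols can cost up to 16 code bits plus 10 amplitude bits, but because they are rare, the average over typical quantized DCT data stays below 8 bits per symbol; that gap, not the zero run-length trick alone, is where the Huffman step's gain comes from. The exact percentage still depends on your data, as the comments say.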