
Resolution-Progressive Jpeg 2000 Performance

Introduction

Jpeg 2000 is an inherently multi-resolution image format: as part of its compression scheme, images are decomposed into progressively lower-resolution approximations of the full-size image. One can order a Jpeg 2000’s codestream such that these approximations are sorted from lowest to highest resolution¹; this structure seems perfectly suited to fetching responsive images over the web: if you have some sort of manifest telling you which bytes are where, you can make a partial (byte-range) request for the file and load a particular scaled-down version of the image. With an intelligent fetching mechanism in place, a multi-resolution image format would simplify authors’ lives significantly vs. any kind of multi-source responsive-images solution, as there would be only one resource to maintain and list in markup.
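
In practice, that partial fetch could be an ordinary HTTP range request. Here’s a minimal sketch, assuming a (hypothetical) manifest has told us that the resolution levels we want occupy the file’s first 125,000 bytes; the URL and byte count are made up:

    $ # ask the server for only the first 125,000 bytes of the .jp2
    $ curl -r 0-124999 -o photo-partial.jp2 https://example.com/photo.jp2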

Yoav wrote some more about the benefits of a single-source solution in his Responsive Image Container post.

But how effective would partially loading Jpeg 2000s actually be, byte-wise? TL;DR: disappointingly ineffective. (But I might be missing something.)

Methodology

OpenJpeg is an open-source implementation of the Jpeg 2000 codec. I couldn't get the newest version (2.0) to generate the index files I needed to see how the codestreams were structured, but 1.5.1 (installed with Homebrew) did, so I used that.

I used four uncompressed source images, of various sizes. These four images in no way provide a statistically significant sample of all internet imagery, but hey, you gotta start somewhere.

I did several things with each:

  1. Rendered a full-sized Jpeg with Photoshop CS6’s “Save for Web…” function, using these settings (most importantly, a “quality” of 50).
  2. Used ImageMagick to determine the PSNR of the full-sized Jpeg, relative to the uncompressed source.²
  3. Used OpenJpeg to render a lossy Jpeg 2000 from the uncompressed source with a matching PSNR. OpenJpeg also generated an index file which detailed the structure of the resulting Jpeg 2000’s codestream, listing the byte-ranges that each of the progressively-downscaled resolutions occupied within the file.³
  4. Rendered quality-50 Jpegs, resized to match each of the smaller resolutions available within the Jpeg 2000, again via Photoshop’s “Save for Web…” dialog.

I then tried to answer the question “should we have just sent a Jpeg?” by determining which weighed more: a partial chunk of the Jpeg 2000, or a Jpeg rendered at the same size.
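
Concretely, the “partial chunk” is just the leading byte range of the resolution-progressive .jp2, cut off at the point where the index file says a given resolution level ends. For example (the 282,644-byte cutoff is the quarter-resolution figure for the Obama image from the results below; the quarter-size Jpeg’s filename is hypothetical):

    $ # keep everything through the quarter-resolution level of the .jp2
    $ head -c 282644 full\ jp2s/Obama_nomination.jp2 > Obama_nomination_quarter.jp2

    $ # weigh it against the quarter-size Jpeg rendered from Photoshop
    $ ls -l Obama_nomination_quarter.jp2 Obama_nomination_quarter.jpg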

Results

All of the files (Tiffs, Jpegs, Jpeg 2000s, indexes, etc.) are available for your perusal here.

[Chart of the data in the table below]

Obama nomination

                          Jpeg bytes   partial Jpeg 2000 bytes   Jpeg 2000 bytes ÷ Jpeg bytes
fullsize (3008 × 2000)     1,139,371                   775,926                            68%
half (1504 × 1000)           320,034                   564,363                           176%
quarter (752 × 500)          117,298                   282,644                           241%
eighth (376 × 250)            37,343                   123,437                           331%
sixteenth (188 × 125)         10,686                    42,286                           396%
32nd (94 × 63)                 3,137                    13,305                           424%

[Chart of the data in the table below]

Zion narrows

                          Jpeg bytes   partial Jpeg 2000 bytes   Jpeg 2000 bytes ÷ Jpeg bytes
fullsize (3994 × 7945)     4,216,192                 2,572,059                            61%
half (1997 × 3973)         1,174,214                 2,099,997                           179%
quarter (999 × 1987)         352,805                 1,180,486                           335%
eighth (500 × 994)           104,064                   543,074                           522%
sixteenth (250 × 497)         29,140                   210,328                           722%
32nd (125 × 249)               8,881                    75,458                           850%

[Chart of the data in the table below]

USA Pro Challenge

                          Jpeg bytes   partial Jpeg 2000 bytes   Jpeg 2000 bytes ÷ Jpeg bytes
fullsize (4608 × 3072)     1,088,889                   382,804                            35%
half (2304 × 1536)           354,058                   365,163                           103%
quarter (1152 × 768)         127,350                   263,140                           207%
eighth (576 × 384)            44,528                   152,565                           343%
sixteenth (288 × 192)         15,194                    74,133                           488%
32nd (144 × 96)                5,260                    31,820                           605%

[Chart of the data in the table below]

Snowman

                          Jpeg bytes   partial Jpeg 2000 bytes   Jpeg 2000 bytes ÷ Jpeg bytes
fullsize (1024 × 1024)        87,443                    44,810                            51%
half (512 × 512)              26,869                    33,446                           124%
quarter (256 × 256)            8,921                    26,002                           291%
eighth (128 × 128)             3,172                    15,635                           493%

Conclusions

The Jpeg 2000 lines above have a characteristic shape — whereas the Jpegs’ file-sizes increase geometrically (along with the number of pixels in the file), the Jpeg 2000 byte-sizes taper off gradually at the top end. Even though the full-res Jpeg 2000s are roughly 30%-65% lighter than the full-res Jpegs for the same PSNR (!), loading a half-size subset of the .jp2 saves you only 5%-30% vs. loading the whole darned thing. Meanwhile, the half-sized Jpeg has shrunk by a factor of three or four, and so in every case already weighs significantly less than its partially loaded Jpeg 2000 counterpart. The “you should have just sent a Jpeg” savings increase with every reduction in size after that.

I don't understand the math behind the Jpeg 2000 format well enough to say with certainty what’s going on. If you can explain these results, please let me know!

I'll update this post if and when anybody supplies me with explanations, or with refutations of my conclusion, which is that Jpeg 2000 wouldn’t make a great responsive image format.

Update — 2014-01-15

Some of the “partial Jpeg 2000 bytes” numbers were off because I’d read them out of the wrong column in the index files (“end_ph_pos” instead of “end_pos”). Thus a few of the partial Jpeg 2000s appeared a few percent smaller than they really are. I’ve fixed the tables and updated the charts.

Kornel Lesiński wrote me to say that PSNR is a poor metric for image quality, and that SSIM/DSSIM correlate more closely with human judgement. He further suggested that PSNR consistently over-rates Jpeg 2000’s compression abilities. This helps explain why the .jp2s were consistently blowing the Jpegs out of the water at full-size.
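
For the curious, a quick DSSIM check can be run from the command line. Here’s a rough sketch using Kornel’s dssim tool, which (as I understand it) wants PNG inputs, hence the ImageMagick conversions; these particular commands aren’t part of the methodology above:

    $ # dssim compares PNGs, so convert both versions first
    $ convert source\ images/Obama_nomination.tif Obama_source.png
    $ convert Obama_nomination.jpg Obama_jpeg.png

    $ # lower scores are better; 0 means the images are identical
    $ dssim Obama_source.png Obama_jpeg.png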

Frédéric Kayser replied on the RICG mailing list, pointing out that exporting via Photoshop’s “Save for Web…” at quality settings of 50 and below triggers chroma subsampling, and that I could try enabling chroma subsampling on the Jpeg 2000s as well by using OpenJpeg’s -s option. I tried this and found two things:

  1. OpenJpeg only seems to chroma-subsample images whose quality has been specified as a compression ratio — a -s 2,2 flag seems to have no effect on images which have been encoded to a specific PSNR with the -q flag (see the sketch after this list).
  2. Chroma subsampling only seems to affect the byte sizes of the highest couple of resolution layers within the encoded Jpeg 2000. The lower-resolution layers are not chroma-subsampled, and therefore enabling it does not help with the problem that I’ve identified here, which is the poor efficiency of partial, low-res loads.
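
For reference, a ratio-based encode with subsampling enabled would look something like the sketch below; the 20× compression ratio is a made-up illustrative value, and the rest mirrors the encoding command in the notes further down:

    $ # -r sets quality as a compression ratio rather than a -q PSNR target,
    $ # which (as far as I can tell) is what lets the -s 2,2 flag take effect
    $ image_to_j2k -i source\ images/Obama_nomination.tif -o subsampled.jp2 -p RLCP -n 6 -r 20 -s 2,2 -I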

Footnotes

  1. Contrast this with progressive Jpegs, which store lower-quality approximations of the whole image first. (Jpeg 2000s can do this, too; OpenJpeg encodes quality-progressive .jp2s by default).
  2. The PSNR-determining commands looked like this:

    $ compare -verbose -metric PSNR Obama_nomination.jpg ../source\ images/Obama_nomination.tif Obama_diff.png
    Obama_nomination.jpg Jpeg 3008x2000 3008x2000+0+0 8-bit sRGB 1.139MB 0.150u 0:00.150
    ../source images/Obama_nomination.tif TIFF 3008x2000 3008x2000+0+0 8-bit TrueColor sRGB 18.08MB 0.040u 0:00.039
    Image: Obama_nomination.jpg
      Channel distortion: PSNR
    red: 35.628
    green: 36.7014
    blue: 35.0391
    all: 35.7361
    writing raw profile: type=APP12, length=15
    writing raw profile: type=exif, length=22
    Obama_nomination.jpg=>Obama_diff.png Jpeg 3008x2000 3008x2000+0+0 8-bit sRGB 6.416MB 6.050u 0:06.049
  3. The Jpeg 2000 encoding commands looked like this:

    $ image_to_j2k -i source\ images/Obama_nomination.tif -x Obama_nomination.idx -o full\ jp2s/Obama_nomination.jp2 -p RLCP -n 6 -q 35.7361 -I
    [INFO] tile number 1 / 1
    [INFO] - tile encoded in 2.527199 s
    Generated outfile full jp2s/Obama_nomination.jp2
    Generated index file Obama_nomination.idx

    Here are some notes I made for myself on the options (more detail is available in the documentation):

    image_to_j2k 
      -i full.tiff // input file
      -x index.idx // index file
      -o out.jp2 // output file
      -p RLCP // resolution-progressive (default is quality-progressive)
      -n 6 // number of resolutions... 6 is default
      -q 40.1234 // PSNR of rendered file
      -I // lossy, default is lossless!