Encoding JPGs with Google's Guetzli

When I first read about Guetzli I thought it probably wouldn't amount to much; it sounded too good to be true. A 35% size reduction in JPGs, backwards-compatible and with no visible quality loss, sounded quite nice. Then, a few days ago, we were talking at work about how to optimize our homepage's speed; I remembered the algorithm and decided to give it a try over the weekend.

My results can be quickly summarized: it is as good at compressing as it is slow at doing it, but recommended if speed is not an issue for you.

Basically, it consumes around 200 MB of RAM and takes one minute per source-image megapixel. This means that a tiny 150x200 thumbnail takes more than 15 seconds to encode on my laptop, while ImageMagick encodes a Full HD PNG into a JPG in around a second. It is indeed extremely slow, making it unfeasible for encoding blog post images and the like on the fly. Nothing that cannot be solved with an asynchronous job that sweeps and optimizes afterwards, but it is extra work to be done.
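Something like the following sketch would do for that sweep, assuming uploads land in a single directory; the paths and the marker file here are just placeholders, not an actual setup:

#!/usr/bin/env bash
# Hypothetical nightly sweep (e.g. run from cron): re-encode JPGs added since
# the previous run. The directory and marker file names are illustrative.
IMAGES_DIR="/var/www/uploads"          # assumed upload directory
STAMP="$IMAGES_DIR/.last_guetzli_run"  # records when the last sweep finished

# Only consider files newer than the marker, if one exists from a prior run.
FILTER=()
[ -f "$STAMP" ] && FILTER=(-newer "$STAMP")

find "$IMAGES_DIR" -type f -name "*.jpg" "${FILTER[@]}" -print0 |
while IFS= read -r -d '' img; do
    # Encode to a temporary file and only replace the original on success.
    if guetzli --quality 85 "$img" "$img.tmp"; then
        mv "$img.tmp" "$img"
    else
        rm -f "$img.tmp"
        echo "guetzli failed on $img" >&2
    fi
done

touch "$STAMP"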

Regarding size, I can confirm I got reductions of between 35% and 50%, sometimes a bit more. The source images were all JPGs with quality between 85% and 95% and dimensions between 150x150 and 1900x1280. Overall, the image "block" went from 290 MB to 110 MB. Some of the images were probably not optimally compressed, as back when I used Windows I had a PowerShell script doing the resizing instead of ImageMagick, but it is still a huge win, and I cannot even imagine how well it will shrink images on blogs where non-technical people sometimes upload images as they come, without any quality or resolution adjustment.
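As a side note, a pre-resize pass with ImageMagick along these lines keeps guetzli from chewing on oversized sources; the bound and quality below are just an example, not the exact settings I used back then:

# Example only: shrink anything larger than 1900x1280 before running guetzli.
# The ">" suffix tells ImageMagick to only downscale, never enlarge.
find . -type f -name "*.jpg" -exec mogrify -resize "1900x1280>" -quality 90 {} +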

I ran the big re-encoding of full folders at 85% quality, with the following exact run params (available here too):

# Re-encode every JPG at quality 85, writing the result alongside it as .jpg.jpeg
find . -type f -name "*.jpg" -exec guetzli --quality 85 {} {}.jpeg \;
# Remove the original files
find . -type f -name "*.jpg" -exec rm {} +
# Strip the temporary .jpeg suffix, restoring the original .jpg names
find . -type f -name "*.jpeg" | rename "s/\.jpeg$//"

As I was going through hundreds of images coming from years of different sources (phones, cameras, image manipulation applications...), I had a small failure ratio of around 1.7% (if I recall the number of failed images correctly) out of almost 2,900 JPGs. Guetzli as a command-line application is really basic, and its output is too scarce to be of use in batching (if you care about errors, that is), so I decided to add some dumb fprintfs to print the failed files; with some replacements and regular expressions I could then easily grab all the failed images. My fork can be found here.
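If you would rather not patch the encoder, a rougher alternative is to drive it file by file from the shell and rely on its exit status; something like this (the log file name is just an example) should be enough to collect the failures:

# Run guetzli per file and append any file it cannot encode to a log for later review.
find . -type f -name "*.jpg" -print0 | while IFS= read -r -d '' f; do
    guetzli --quality 85 "$f" "$f.jpeg" || echo "$f" >> guetzli-failures.log
done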

And that's really all I had to say about it. Clearly the open-sourced tool is not the same thing Google uses internally, as there is a single test and I refuse to believe they would use that command-line tool in production, but the algorithm is there, it is as awesome as advertised, and I'll definitely optimize my blog post images with it from now on.
