Google open-sourced Guetzli

Earlier this week the folks at Google published a blog post announcing that they’ve decided to open-source Guetzli, their JPEG encoder, which is supposedly much better than anything else available.

The goal of this release is to help webmasters compress their images better so that they have less impact on page load time. Beyond that, reducing image size also reduces storage space on the server, which matters especially for very large sites.

Since I’ve already been using JPEGOptim to get the maximum compression out of my image files, and sometimes Mogrify to resize them, I decided to give Guetzli a try; there’s always room for improvement. So I installed the dependencies and cloned the Guetzli repo on one of my production servers.
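For context, my existing workflow is nothing fancy: the two command line tools called from a small script. A minimal sketch of that, assuming jpegoptim and ImageMagick’s mogrify are installed and using a made-up upload directory:

```python
import subprocess
from pathlib import Path

UPLOADS = Path("/var/www/example.com/uploads")  # hypothetical upload directory

for image in sorted(UPLOADS.glob("*.jpg")):
    # Shrink anything wider than 1920px; the ">" tells mogrify to never upscale.
    subprocess.run(["mogrify", "-resize", "1920x>", str(image)], check=True)
    # Then let jpegoptim strip metadata and cap the quality at 85.
    subprocess.run(["jpegoptim", "--strip-all", "--max=85", str(image)], check=True)
```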

Compiling it was fairly quick, so without bothering too much with the release docs I jumped into testing it on one of my images: a 3872×2592 JPEG with a size of 1612 KB (I’m using KB instead of MB because the values are more precise). After running Guetzli, which by the way means “cookie” in Swiss German, I noticed that the process takes an awfully long time to complete, even on a server with 4 GB of RAM available and nearly no load.
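For reference, the invocation itself is trivial. Here’s roughly how I’d wrap it from Python, assuming the binary ends up in bin/Release after make (the file names here are made up):

```python
import subprocess

GUETZLI = "./guetzli/bin/Release/guetzli"  # assumed default make output location
SRC = "DSC_0123.JPG"                       # made-up name for the 3872x2592, 1612 KB test shot
DST = "DSC_0123_guetzli.jpg"

# --quality 95 is the default; Guetzli refuses to go below 84.
subprocess.run([GUETZLI, "--quality", "95", SRC, DST], check=True)
```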

Half an hour later there was no change: the process was still running, with a server load of 1.5 (which wasn’t too much, by the way), so I checked the original docs (yeah, I know, RTFM first!) and noticed they mention that Guetzli is heavy on resources, requiring about 300 MB of memory for every megapixel of the input image. That means my image, at 10,036,224 pixels, needs over 3 GB of memory for the conversion. Wow, that’s heavy.
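The estimate is easy to reproduce from the documented figure of roughly 300 MB per megapixel:

```python
width, height = 3872, 2592
pixels = width * height                       # 10,036,224 pixels, i.e. ~10 megapixels
memory_mb = pixels / 1_000_000 * 300          # ~300 MB per megapixel, per the docs
print(f"{pixels} px -> ~{memory_mb:.0f} MB")  # roughly 3011 MB, so over 3 GB
```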

Luckily for me, Linux’s memory throttling didn’t let Guetzli grab all the available memory and do the conversion in one quick pass; instead it ground through it over approximately 40 minutes.

The result? Well, you have it below: the same picture, reduced to 1356 KB:

With this specific, fairly complex image, chosen at random, I managed to shave 256 KB off the JPEG, roughly a 15% improvement. That’s not much against the initial expectations: the developers mention reductions between 20% and 30%, and in some places an average of 35% is quoted. Still, in the economy of server resources, 15% on top of what JPEGOptim was already able to do is a great improvement for such a complex image.
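For the record, the math behind that percentage:

```python
original_kb, guetzli_kb = 1612, 1356
saved_kb = original_kb - guetzli_kb                            # 256 KB
print(f"saved {saved_kb} KB ({saved_kb / original_kb:.1%})")   # ~15.9%
```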

The only real problem I can see with Guetzli is that it is so resource intensive that it can’t reasonably be used on production servers, or to optimize multiple images simultaneously, which makes it a poor fit for bulk image optimization.
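If you still want to use it for a one-off batch, the only sane approach I can see is to process images strictly one at a time, ideally off the production box or during off-peak hours. A rough sketch, with made-up paths:

```python
import subprocess
from pathlib import Path

GUETZLI = "./guetzli/bin/Release/guetzli"       # assumed build output location
UPLOADS = Path("/var/www/example.com/uploads")  # hypothetical directory

for src in sorted(UPLOADS.glob("*.jpg")):
    dst = src.parent / (src.stem + ".guetzli.jpg")
    # One image at a time: each run may need several GB of memory on large photos,
    # so running these in parallel on a shared server is asking for trouble.
    result = subprocess.run([GUETZLI, str(src), str(dst)])
    if result.returncode != 0:
        print(f"guetzli failed on {src}, keeping the original")
```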

P.S. Guetzli also works on PNG images and converts them to JPEG. However, if you run it on PNGs with a transparent background, the conversion will replace the transparency with a black background, since the JPEG format does not support an alpha channel.
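If the black background is a problem, one workaround is to flatten the transparency onto white yourself before handing the file to Guetzli. A quick sketch with Pillow (an extra dependency, nothing Guetzli itself provides; file names are made up):

```python
from PIL import Image

img = Image.open("logo.png").convert("RGBA")               # hypothetical input file
white = Image.new("RGBA", img.size, (255, 255, 255, 255))  # opaque white canvas
flat = Image.alpha_composite(white, img).convert("RGB")    # drop the alpha channel
flat.save("logo_flat.png")                                 # feed this one to guetzli
```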