Google has released a new algorithm named Guetzli, which it claims can encode large, high-resolution JPEG files up to 35% smaller in size.
Google has been investing a lot in making the web a better place. They have developed all sorts of fancy-named methods, markups, and tools to improve website performance and, with it, the user experience. They created AMP (Accelerated Mobile Pages), and they developed Brotli, which has a better compression ratio than the world's favorite, GZip.
Guetzli JPEG Encoder
Before we get into this all-new fancy algorithm, let me tell you that Guetzli [guɛtsli] is a Swiss German word that means cookie.
Google has pointed out that Guetzli can only generate nonprogressive (sequential) JPEG images, mainly because of the faster decompression speed they offer.
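You can check this yourself: Pillow flags progressive JPEGs in an image's metadata. Here is a minimal sketch, assuming Pillow is installed and a Guetzli-produced file exists at the hypothetical path `output.jpg`:

```python
from PIL import Image

# Inspect a JPEG's metadata; Pillow sets the "progressive" flag
# only for progressive files, so Guetzli output should report sequential.
with Image.open("output.jpg") as im:  # hypothetical path
    if im.info.get("progressive"):
        print("progressive JPEG")
    else:
        print("sequential (baseline) JPEG")
```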
Google also states that images produced by the Guetzli algorithm are usually 20% to 30% smaller than those produced by the libjpeg library.
Overall, I must say that Google is doing a fantastic job of making the web faster and better. After all, images make up around 70% to 80% of a typical website's page size. And if developers and webmasters have suitable options for compressing those files, then you and I, as regular internet users, will get to experience a different kind of web: one that is much faster and consumes less bandwidth.
How does the Guetzli algorithm work?
For all the programming nerds who love technical details: Guetzli works very similarly to Google's Zopfli algorithm. Zopfli produces smaller PNG and gzip files, and there's no need for any new file format, unlike WebP or RNN-based image compression techniques.
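To illustrate the "no new file format" point with Zopfli: its output is an ordinary deflate/gzip stream that any standard decoder can read. A minimal sketch, assuming the third-party pyzopfli bindings (`pip install zopfli`); the module and function names below come from those bindings, not from Google's C library, and are worth double-checking against their docs:

```python
import gzip
import zopfli.gzip  # assumed pyzopfli binding

data = b"hello, web " * 1000

# Zopfli spends extra CPU searching for a denser deflate encoding...
compressed = zopfli.gzip.compress(data)

# ...but the result is still plain gzip, so the standard library reads it back.
assert gzip.decompress(compressed) == data
print(f"{len(data)} bytes -> {len(compressed)} bytes")
```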
Guetzli works on the quantization stage of image compression. Like any other lossy compression method, it produces smaller images by reducing their visual quality, but with a twist: Guetzli uses a search algorithm to close the gap between the psychovisual modeling built into the JPEG format and Guetzli's own psychovisual model. This lets it work at a level of image optimization far beyond basic color manipulation and mathematical transformations. Modeling color perception and visual masking in this way results in better file compression, but at the cost of a much longer processing time.
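To make the search idea concrete, here is a heavily simplified sketch in Python with Pillow. The real encoder searches over quantization tables and DCT coefficients, scoring each candidate with its Butteraugli psychovisual model; this toy version merely binary-searches libjpeg's quality setting and uses a crude pixel-difference score as a stand-in for the perceptual metric:

```python
import io
from PIL import Image, ImageChops, ImageStat

def perceptual_distance(a, b):
    """Crude stand-in for Butteraugli: mean absolute pixel difference."""
    diff = ImageChops.difference(a.convert("RGB"), b.convert("RGB"))
    return sum(ImageStat.Stat(diff).mean) / 3.0

def encode_smallest(img, max_distance=2.0):
    """Binary-search JPEG quality for the smallest file whose decoded
    result stays within the perceptual-distance budget."""
    img = img.convert("RGB")  # JPEG cannot store alpha
    lo, hi, best = 1, 95, None
    while lo <= hi:
        q = (lo + hi) // 2
        buf = io.BytesIO()
        img.save(buf, "JPEG", quality=q)
        decoded = Image.open(io.BytesIO(buf.getvalue()))
        if perceptual_distance(img, decoded) <= max_distance:
            best = buf.getvalue()  # acceptable: try a lower quality / smaller file
            hi = q - 1
        else:
            lo = q + 1             # too lossy: raise the quality
    return best

# usage: smallest_jpeg_bytes = encode_smallest(Image.open("photo.png"))
```

Scoring every candidate with a full psychovisual model is exactly why Guetzli's processing times are so much longer than libjpeg's.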
Can I implement it now?
Now this is really interesting, because I feel this algorithm is not suitable for everyone. Even though the generated images are smaller, libjpeg is still faster and more appropriate for regular use.
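For reference, trying it out is straightforward: the project ships as a standalone command-line tool. Here is a minimal wrapper, assuming you have built the `guetzli` binary from Google's repository and it is on your PATH; per the project README, it takes a `--quality` flag plus input and output paths (the filenames below are placeholders):

```python
import subprocess

def guetzli_encode(src, dst, quality=90):
    """Shell out to the guetzli binary, raising if the tool fails.
    The stock build rejects quality values below 84."""
    subprocess.run(["guetzli", "--quality", str(quality), src, dst], check=True)

guetzli_encode("original.png", "compressed.jpg")  # placeholder filenames
```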
So why do I say it's not suitable for everyone?
Well, because Google has also stated that “Guetzli uses a large amount of memory”: you should budget around 300MB of RAM for every megapixel of input image. That is simply not practical for me, especially if I want to run it on my server.
I have almost 3GB of images on my server. So if I ever plan to compress them all with this new encoder, running 4 instances at the same time and assuming the average image is about half a megapixel, my server is going to need at least 600MB of free RAM for the duration of the run, and I don't know how long the whole process would take to complete.
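That back-of-the-envelope math is easy to script before committing to a batch job. A quick sketch using the ~300MB-per-megapixel figure quoted above; the image size and instance count are the same assumptions as in my example:

```python
MB_PER_MEGAPIXEL = 300  # Google's stated figure for Guetzli

def ram_needed_mb(megapixels, parallel_instances=1):
    """Rough peak-RAM estimate for running Guetzli jobs in parallel."""
    return megapixels * MB_PER_MEGAPIXEL * parallel_instances

# Four instances on ~0.5-megapixel images (the assumption above):
print(ram_needed_mb(0.5, parallel_instances=4))  # -> 600.0
```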