Google is still working on improving its image processing. A few months ago it gave us RAISR, a technique the Great G devised to create high-resolution images from much lower-quality samples; today the Mountain View company is once again offering us novelties in this field.
According to a post on the Google Research Blog, the company is working on a new encoder for JPEG files called Guetzli, with which it intends to reduce the size of these files by 35% compared to the JPEGs currently being generated.
But perhaps what is most interesting for developers is that Google has open-sourced the Guetzli code, as reported by The Next Web. If you are a developer interested in using this compression algorithm, you can consult the documentation and download everything from GitHub.
In the examples shown above, the left part of the image is the uncompressed original, the middle is the same image after going through the libjpeg encoder, and the last is the result of Guetzli.
This compression algorithm is compatible with existing browsers and image tools. According to Google, it makes it possible to generate files that are not too large without compromising the quality of the image.
To achieve this, Guetzli focuses on what is known as the “quantization phase” of the compression process. This is the stage where algorithms can shrink an image dramatically, but it can also produce very low quality if it is not handled carefully.
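To see why the quantization phase matters so much, here is a minimal sketch of what JPEG-style quantization does: pixel blocks are transformed with a 2-D DCT, and each coefficient is divided by a table entry and rounded. A coarser table zeroes out more coefficients, which is what lets the entropy coder shrink the file, at the cost of quality. This is generic JPEG behaviour, not Guetzli's actual code; the flat quantization tables below are hypothetical values chosen only for illustration.

```python
import numpy as np

def dct2(block):
    # Naive orthonormal 2-D DCT-II (the transform JPEG uses),
    # built directly from the cosine basis.
    n = block.shape[0]
    k = np.arange(n)
    basis = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    scale = np.full(n, np.sqrt(2 / n))
    scale[0] = np.sqrt(1 / n)
    C = scale[:, None] * basis
    return C @ block @ C.T

# A synthetic 8x8 block of pixel values, level-shifted to centre on zero
# as JPEG does before the DCT.
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(8, 8)).astype(float) - 128.0

coeffs = dct2(block)

# Two hypothetical flat quantization tables: fine and coarse.
fine = np.full((8, 8), 4.0)
coarse = np.full((8, 8), 16.0)

q_fine = np.round(coeffs / fine)
q_coarse = np.round(coeffs / coarse)

# The coarser table leaves fewer nonzero coefficients to encode,
# which is where the size reduction (and the quality risk) comes from.
print("nonzero (fine):  ", np.count_nonzero(q_fine))
print("nonzero (coarse):", np.count_nonzero(q_coarse))
```

Guetzli's contribution, per Google's post, is searching for quantization choices that spend those zeroed-out coefficients where the human eye notices them least.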
As the company itself points out, this reduction method is similar to another compression algorithm it announced a while ago, called Zopfli, which reduced the size of PNG and GZIP files without the need to create a new format. Bear in mind, however, that Guetzli is slower than other compression algorithms.