Be our Guetzli, says Google, to make beastly JPEGs beautifully small

Google's revealed details of a new JPEG encoder it calls “Guetzli”, which it says can shrink images by between 29 and 45 per cent without making them appreciably less pretty. Google bothers with this stuff because it wants web pages to load faster, so that people use the web more and see more of its ads. To that end, the …

  1. Charlie Clark Silver badge

    Research is good

    Sounds like an attempt to backport some of the optimisations used in WebP to the older format. Google definitely favours WebP for images, as it is a single format that does both lossy and lossless compression well and supports alpha channels. Unfortunately, there are still holdouts against WebP: Apple and Microsoft unsurprisingly, but disappointingly also Firefox, though it supports the parent WebM format.

    As for the economics: I think Facebook did the numbers on the money it saves by compressing images as much as it does, and it's more than you'd think.

    1. Dave 126 Silver badge

      Re: Research is good

      The money the server saves has to be balanced against users' reactions when they find their mobile phone battery depletes more quickly. That said, there are a lot of websites that clearly don't give a damn about their readers' time or battery life.

      I'm sure the brainy folk at Google have ways of estimating how much this can be mitigated by implementing the decoder in hardware.

      1. Charlie Clark Silver badge

        Re: Research is good

        The money the server saves has to be balanced against users' reactions when they find their mobile phone battery depletes more quickly.

        JPEG, like most codecs, is asymmetric: it requires more work to encode than to decode. The paper notes that the encoder is currently too slow to be used in production. Not that bitmap decoding is particularly onerous or relevant for battery use: lighting the screen to display the image is almost certainly harder on the battery.

        Hardware acceleration is always possible but unlikely here: JPEG has other problems inherent in the way it works, which is why there are several formats vying to replace it. Of these, WebP is derived from video codecs, for which there is already hardware acceleration. WebP already compares favourably with JPEG in time/quality tests.
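
        If you want a rough feel for the gap yourself, here's a minimal sketch in Python with Pillow (an assumption on my part: your Pillow build includes WebP support, and "test.png" is a placeholder; equal quality numbers aren't perceptually equivalent across codecs, so treat the result as indicative only):

        # Rough file-size comparison of JPEG vs WebP from the same source image.
        # Assumes Pillow is installed with WebP support; "test.png" is hypothetical.
        import os
        from PIL import Image

        img = Image.open("test.png").convert("RGB")
        img.save("test.jpg", "JPEG", quality=85)
        img.save("test.webp", "WEBP", quality=85)

        for path in ("test.jpg", "test.webp"):
            print(f"{path}: {os.path.getsize(path)} bytes")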

    2. Dave 126 Silver badge

      Re: Research is good

      Ohh, I'm not sure that this is being added to the JPEG format; it's just that the input will be a JPEG.

      1. Corin

        Re: Research is good

        Looking at the GitHub page (https://github.com/google/guetzli), it is specifically an encoder for JPEGs. The decoding is trivial; if anything, it'll use less battery, since there will be less data to transfer and thus the cell radio can power down sooner.

        The real cost is the one-off computation to produce this smaller JPEG file. Potentially worth it for someone like the BBC to save bandwidth costs for images on their news site; less so for a low-traffic blog.
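
        To put rough numbers on that trade-off, here's a back-of-envelope sketch in Python (every figure in it is invented for illustration):

        # Back-of-envelope: recurring bandwidth saved vs a one-off encode cost.
        # All numbers are hypothetical, purely for illustration.
        image_size = 250_000          # bytes: a typical news-site JPEG (assumed)
        saving_ratio = 0.35           # Google claims roughly 29-45 per cent
        views_per_month = 5_000_000   # a busy site's popular image (assumed)

        bytes_saved = image_size * saving_ratio * views_per_month
        print(f"~{bytes_saved / 1e9:.1f} GB/month saved on this one image")
        # A low-traffic blog has orders of magnitude fewer views, so the
        # one-off CPU time never pays for itself.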

    3. Anonymous Coward

      Re: webp

      Actually Firefox went ahead with WebP implementation, and it's at a fairly advanced stage.

      Also Chrome is adding support for APNG. Dogs and cats living together!

      1. Charlie Clark Silver badge

        Re: webp

        Actually Firefox went ahead with WebP implementation

        Thanks for the heads-up. Apple apparently also tried it last year, only to remove it from the final release.

    4. Anonymous Coward

      Re: Research is good

      Google do not help the case for WebP by not making libraries easily available for Microsoft development (given MS hasn't been bothered to rush to support it), and the libraries that do exist aren't very user-friendly. Most C# devs create a wrapper around libwebp.dll (or use a wrapper someone else has created).

      If using something is fast and easy (in dev terms) then take-up will rocket; if it's a lengthy PITA then most devs won't bother, as they cannot justify it to PHBs.

  2. Anonymous Coward

    [smaller jpegs] "so that people use the web more and see more of its ads"

    If the ads were only simple jpegs, I just might not filter them quite so much.

    1. John Brown (no body) Silver badge

      Re: [smaller jpegs] "so that people use the web more and see more of its ads"

      It's the current web design model that makes it so easy to block ads. They are links to third-party sites, so you just leave those blocked in NoScript while you, the party of the first part, allow the second-party site you are visiting. Web designers do this with most of their site add-ons, even typefaces. If they served the third-party add-ons from their own servers, then we'd only see them as coming from the actual site visited, and that would make blocking them harder. Oops. Shhhhhh... don't tell them. Of course, they won't really do that. It would use up their bandwidth and resources instead of ours, and they might have to work for a living instead of just linking all over the world to other people's work.

  3. Anonymous Coward

    There are two slowdowns....

    1. Images that are massive in size (e.g. 5000x4000), where they just resize the image on the page instead of shrinking the image beforehand.

    2. The biggest slowdown? Adverts.

    1. John Brown (no body) Silver badge

      Re: There are two slowdowns....

      "1. Images that are massive in size (e.g. 5000x4000) then they just resize the image on the page, instead of shrinking the image beforehane."

      I work "in the Channel" and we regularly get internal company emails with attached doc[x] or pdf files with very large images embedded that are scaled to fit by the document formatting. I could almost forgive if this was from marketing, but tech support do it too. I'm hoping that one day an important email from someone important will be sent which gets rejected by the fairly spare attachment size limit imposed on our email servers.

      1. Anonymous Coward

        Re: There are two slowdowns....

        @John Brown

        "I work in the Channel"

        Wow, ferry driving is more technical than I thought.

  4. Corin

    Giving it a try

    Compiled very quickly on this i5 MacBook Pro.

    I pointed it at a 5888x3912 8-bit colour PNG source, with quality set to 95. It pegged one CPU core at 99%, with memory usage just shy of 5GB. 23 minutes later, I had myself a nice small JPEG. Looked perfect!

    I concur with the researchers: this is worth doing where the bandwidth savings will be significant, or if your application demands a very small file size, but for general-purpose use it's probably much easier not to bother. Focus the attention on the 2MB JavaScript file and all the work it does instead!
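
    For anyone wanting to reproduce this, the invocation is along these lines, wrapped in Python for repeatability (a sketch: it assumes the guetzli binary is on your PATH, and the filenames are placeholders):

    # One-off Guetzli encode at quality 95, via the command-line tool.
    # Assumes "guetzli" is on PATH; input/output names are placeholders.
    import subprocess

    subprocess.run(
        ["guetzli", "--quality", "95", "input.png", "output.jpg"],
        check=True,  # raise CalledProcessError if the encoder fails
    )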

    1. Charlie Clark Silver badge

      Re: Giving it a try

      Now do a comparison with WebP…

  5. Anonymous Coward

    A better solution...

    ... quit filling web pages with useless images....

    1. batfastad

      Re: A better solution...

      El Reg should take the lead here by stopping adding vaguely-related stock photography to the top of each article.

      1. Charlie Clark Silver badge

        Re: A better solution...

        Increasingly it's not the images that are the problem, it's the SPA (single-page app) development approach, which tries to turn websites into applications and loads what is essentially a mini operating system to handle stuff that may never happen. You can see this when apparently text-only websites become 5 MB because framework-of-the-week and all its friends have to be loaded. Sort of like the "quick start" option for MS Office, even for people who never use Office.

      2. Simon Sharwood, Reg APAC Editor (Written by Reg staff)

        Re: Re: A better solution...

        FWIW, when all we have is vaguely-related stock, we have the option to use a tag so it doesn't appear. I do so for most stories. The idea is that if the image doesn't help to tell the story, we don't use it. We also have a size limit on header images to help page load times.

  6. hellwig

    What if I need the original quality of the image?

    Compressing for how the human brain perceives images is one thing, but what if I'm trying to download images for manipulation with image editing software (you know, stealing someone's profile picture off Facebook and photo-shopping in embarrassing things to use for my presidential campaign). How do I, a simple user, ask my web browser to ask the server for the original image?

  7. War Puppy

    guetzli testing

    Same PNG resized for testing (500, 1000, 2000 px); normal JPEG compression is too fast to worry over. Skylake 6700. Set at quality 90.

    As you scale up, the time to compress gets 'entertaining'. I can see where Ryzen cores/threads will be useful if I want to do my 18,000 images. The 2000px one throws up an oddity: the Guetzli file comes out larger than the standard JPEG. Needs more work.

    Source PNGs:    test-500.png  292K    test-1000.png  1.2M     test-2000.png  5.0M
    Standard JPEGs: c-500.jpg     56K     c-1000.jpg     200K     c-2000.jpg     1000K
    Guetzli JPEGs:  g-500.jpg     40K     g-1000.jpg     168K     g-2000.jpg     1.1M

    Guetzli encode times (time output):
    500px:  real 0m7.588s    user 0m7.270s    sys 0m0.317s
    1000px: real 0m46.546s   user 0m44.248s   sys 0m2.291s
    2000px: real 5m4.220s    user 4m51.535s   sys 0m12.659s
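
    A sketch of the harness, for anyone who wants to repeat the test (Python, with Pillow doing the resizing; it assumes guetzli is on the PATH and a "master.png" source, both placeholders for whatever you use):

    # Resize one master PNG to several widths, then time guetzli on each.
    # Assumes Pillow is installed, guetzli is on PATH, "master.png" exists.
    import subprocess
    import time
    from PIL import Image

    master = Image.open("master.png")

    for width in (500, 1000, 2000):
        height = round(master.height * width / master.width)  # keep aspect ratio
        src = f"test-{width}.png"
        master.resize((width, height)).save(src)

        start = time.perf_counter()
        subprocess.run(["guetzli", "--quality", "90", src, f"g-{width}.jpg"],
                       check=True)
        print(f"{width}px: {time.perf_counter() - start:.1f}s")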

  8. Kevin McMurtrie Silver badge

    Better JPEG/MPEG/you-name-it encoders have been around for a long time. The trick is avoiding the patent trolls.

  9. Jim-234

    Needs to be able to efficiently use multiple CPU cores before it's worth anything

    From a quick play around with it, the results are promising in terms of size and visual quality.

    However until it can properly use as many CPU cores as you can throw at it, I don't see it catching on.

    Now if it could scale across your CPUs, so your 8-to-16-core workstation could chomp through it in a minute or less, then it might see use among web designers.
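
    In the meantime you can at least run one single-threaded encode per core; a minimal sketch (assuming the guetzli binary is on your PATH and a directory of PNGs called "images", both placeholders):

    # Work around Guetzli's single-threaded encoder by running one
    # process per image across all cores. Assumes "guetzli" is on PATH.
    # Mind the RAM: the README suggests ~300MB per megapixel of input.
    import subprocess
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def encode(src: Path) -> str:
        out = src.with_suffix(".jpg")
        subprocess.run(["guetzli", "--quality", "90", str(src), str(out)],
                       check=True)
        return out.name

    if __name__ == "__main__":
        pngs = sorted(Path("images").glob("*.png"))  # hypothetical input dir
        with ProcessPoolExecutor() as pool:          # defaults to core count
            for name in pool.map(encode, pngs):
                print("done:", name)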

    1. P. Lee

      Re: Needs to be able to efficiently use multiple CPU cores before it's worth anything

      >Now if it could scale across your CPUs, so your 8 to 16 core workstation can chomp through it in a minute or less

      What about a graphics card?

  10. TeeCee Gold badge
    WTF?

    Hmm.

    I can clearly see in this article that there's one thing that Google obviously have working that they're not telling us about.

    Something that emulates pulling letters out of a Scrabble bag at random to form new words.

  11. PushF12
    Pint

    Google Photos

    Will they be using this codec to recode Google Photos libraries?

    Props to Google for doing this kind of hard computer science.

  12. the Jim bloke

    Has no one else noticed Butteraugli?

    " then uses a Google-developed model of human vision called “Butteraugli” to figure out "

    Pronounced "Butt-r-ugly".

    Someone has a sense of humour (college level).
