• floquant@lemmy.dbzer0.com · 8 hours ago

    Good, Chrome has allowed them to be in a position to dictate the future of internet standards for too long. Fuck em

  • kadu@scribe.disroot.org · 19 hours ago

    Google tried to sabotage it in favour of its own WebP standard.

    Nobody gives a shit about WebP.

    • mlg@lemmy.world · 6 hours ago

      Their dumb counter-argument was AVIF, for the same reason as WebP: reduced total file size, since they’re a massive cloud provider and want to pinch pennies wherever possible.

      JPEG XL also has progressive decoding that increases the quality as the file downloads, instead of the classic top-to-bottom scan loading, which is sick, but I don’t think any browser has actually bothered to implement it yet after the Google hit job.

      Hopefully we’ll see it in action soon.

      • kadu@scribe.disroot.org · 18 hours ago

        They don’t have to. It’s backwards compatible. You can ignore it and we can keep on happily using it.

        Fuck Google, fuck WebP.

          • rjek@feddit.uk · 17 hours ago

            It’s “compatible” in that it can represent old JPEG/JFIF data more efficiently and in less space, and the transformation to JPEG XL and back to JPEG/JFIF is lossless (you don’t lose any /more/ quality; you can get the same bits back out) and quick enough to be done on demand. You could, for example, re-encode all your old photos on your CDN as JPEG XL with no loss of quality, save a bunch of disc space and bandwidth when serving to modern browsers, and translate dynamically back to the old format for older browsers.
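
            A rough sketch of what that server-side re-encode could look like, assuming the libjxl command-line tools (cjxl/djxl) are installed; the file names are made up and exact flags/defaults may vary by version:

            ```python
            # Losslessly transcode an existing JPEG to JPEG XL and reconstruct
            # the original bytes on demand. Assumes cjxl/djxl from libjxl are
            # on PATH; file names are illustrative only.
            import subprocess
            from pathlib import Path

            def jpeg_to_jxl(jpeg_path: Path, jxl_path: Path) -> None:
                # For JPEG input, cjxl keeps the reconstruction data needed to
                # get the identical JPEG back (lossless transcode, smaller file).
                subprocess.run(["cjxl", str(jpeg_path), str(jxl_path)], check=True)

            def jxl_to_jpeg(jxl_path: Path, jpeg_path: Path) -> None:
                # djxl rebuilds the original JPEG bit stream from that data.
                subprocess.run(["djxl", str(jxl_path), str(jpeg_path)], check=True)

            if __name__ == "__main__":
                jpeg_to_jxl(Path("photo.jpg"), Path("photo.jxl"))
                jxl_to_jpeg(Path("photo.jxl"), Path("photo.roundtrip.jpg"))
                # If reconstruction data was kept, the round trip is bit-exact.
                print(Path("photo.jpg").read_bytes() == Path("photo.roundtrip.jpg").read_bytes())
            ```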

              • rjek@feddit.uk · 7 hours ago

                No, I’m saying that JPEG XL can perfectly represent old JPEG/JFIF data, so on the server side you can store all your image data once and more efficiently, and still support old clients without any lossy cascade or the CPU load of having to re-encode. That is what is meant about it offering backwards compatibility.

              • reddig33@lemmy.world · edited · 11 hours ago

                What they’re saying is that a web server can create a traditional JPEG file from a JPEG XL one to send to a client as needed. So you’re saving backend storage space… sometimes. Until browsers adopt it widely, you’re still creating and transmitting a traditional JPEG file, and now you’ve increased the server space needed because you’re creating and storing two copies of the file in two different formats.

                Developers are already doing this with WebP, and everyone hates WebP (if your browser doesn’t support WebP, the backend sends you the JPEG copy). I don’t see any advantage here except some hand-waving about “but in the future”, just like with most new formats trying to win adoption.

                • Logi@lemmy.world · 10 hours ago

                  The difference (claimed by the comment above) is in the words

                  without loss of quality

                  So you can convert back and forth without the photocopy-of-a-photocopy problem.

                  And you don’t have to store the second copy of the file, except for caching frequently fetched files, which I’m sure will just be an nginx rule.

                • The_Decryptor@aussie.zone · 9 hours ago

                  What they’re saying is that a web server can create a traditional jpeg file from a jpeg xl to send to a client as needed.

                  Other way around, you can convert a “web safe” JPEG file into a JXL one (and back again), but you can’t turn any random JXL file into a JPEG file.

                  But yeah, something like Lemmy could recompress uploaded JPEG images as JXL on the server, serve them as JXL to updated clients, and convert back to JPEG as needed, saving server storage and bandwidth with no quality loss.
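
                  As a minimal sketch of that flow (Flask and the djxl tool are assumptions here, not something from the thread): store only the .jxl master, serve it to clients that advertise image/jxl, and reconstruct a byte-identical JPEG once for everyone else.

                  ```python
                  # Sketch only: one .jxl master per image, JPEG generated lazily
                  # for clients that don't accept image/jxl. Assumes Flask and djxl;
                  # no input validation, not production code.
                  import subprocess
                  from pathlib import Path
                  from flask import Flask, request, send_file

                  app = Flask(__name__)
                  STORE = Path("images")      # holds only the *.jxl masters
                  CACHE = Path("jpeg-cache")  # lazily filled with reconstructed JPEGs

                  @app.route("/img/<name>")
                  def image(name: str):
                      jxl = STORE / f"{name}.jxl"
                      if "image/jxl" in request.headers.get("Accept", ""):
                          return send_file(jxl, mimetype="image/jxl")
                      # Older client: rebuild the original JPEG once, then cache it.
                      jpg = CACHE / f"{name}.jpg"
                      if not jpg.exists():
                          CACHE.mkdir(exist_ok=True)
                          subprocess.run(["djxl", str(jxl), str(jpg)], check=True)
                      return send_file(jpg, mimetype="image/jpeg")
                  ```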

        • reddig33@lemmy.world · edited · 14 hours ago

          How is it backwards compatible? Everything I’ve read so far says the opposite: that it requires re-encoding the image into the new format, and keeping around or generating a copy of the image in the current JPEG format for older software.

          Are you saying a browser or app that currently only supports JPEG can open and render a JPEG XL image?

          Edit: Yeah. It’s not backwards compatible. And sysadmins are already doing the “make two copies of the image” thing with WebP and the current JPEG format.

  • network_switch@lemmy.ml · 18 hours ago

    It is pretty great for shrinking the file size of my photos. It’s part of the PDF standard now. It’ll get great support across the board eventually. Still pretty early in adoption.