"Set for a year-end release, AV2 is not only an upgrade to the widely adopted AV1 but also a foundational piece of AOMedia’s future tech stack.

AV2, a generational leap in open video coding and the answer to the world’s growing streaming demands, delivers significantly better compression performance than AV1. AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range. AV2 marks a milestone on the path to an open, innovative future of media experiences."

  • LeFantome@programming.dev · 1 point · 11 hours ago

    The main thing I want is small file size for the quality. Netflix, YouTube, and I agree on that.

    Most of my stuff is AV1 today even though the two TVs I typically watch it on do not support it. Most of the time, what I am watching is high-bitrate H.264 that was transcoded from the low-bitrate AV1.

    I will probably move to AV2 shortly after it is available. At least, I will be an early adopter. The smaller the files the better. And, in the future when quality has gone up everywhere, my originals will play natively and look great.
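
    For anyone curious, the transcode step for the older TVs is nothing exotic. A rough sketch with ffmpeg (filenames are placeholders, and CRF 18 is just an example of a "big file, keep the quality" target; this assumes a build with libx264):

        # re-encode the small AV1 original into a large, TV-friendly H.264 file
        ffmpeg -i movie.av1.mkv -c:v libx264 -crf 18 -preset slow -c:a copy movie.h264.mkv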

  • Hazzard@lemmy.zip · 4 points · 1 day ago

    Very cool! I’ve only just recently gotten to experience the joys of AV1 for my own game recordings (Linux is way ahead of Windows here), and dang is it nice. 10-minute flashback recordings of 4K HDR@60 for only 2.5GB, and the results look fantastic. I can just drag and drop them over to YouTube as well; AV1 is fully supported over there.

    Glad to see things moving, I’ll be eager to check this out in a few years once it has wider support!

    • rezad@lemmy.world (OP) · 3 points · 22 hours ago

      You didn’t do the wrong thing.

      What many people don’t notice is that hardware support for a codec in a GPU has two parts: one is decoding and one is encoding.

      For quality video, nobody does hardware encoding (at least not on consumer systems, e.g. Linux with this NVIDIA 3050).

      For most users, the important thing is hardware support for decoding, so that they can watch their 4K movie with no issue.

      So you are in the clear: you can watch AV1 right now, and worry about AV2 when it becomes popular enough to matter, which will be at least 4 years from now.
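
      If you want to check what your own setup can actually decode, a rough sketch on Linux (assumes ffmpeg is installed; vainfo only applies to VA-API systems such as Intel/AMD):

          # which hardware acceleration methods this ffmpeg build knows about
          ffmpeg -hide_banner -hwaccels
          # which AV1 decoders are available (software ones like libdav1d,
          # plus any hardware-backed ones your build exposes)
          ffmpeg -hide_banner -decoders | grep av1
          # on Intel/AMD with VA-API, list the codec profiles the GPU can handle
          vainfo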

    • Majestic@lemmy.ml · 10 points · 2 days ago (edited)

      And it will be so resource intensive to encode compared to existing standards that it’ll probably take 14 years before home media collectors (or the yar-har types) are able and willing to use it over HEVC and AV1. :\

      As an example, AV1 encodes are to this day extremely rare in the p2p scene. Most groups still work with h264 or h265, even those focusing specifically on reducing sizes while maintaining quality. By contrast, HEVC had significant uptake in the p2p scene within 3-4 years of its release (we’re on year 7 for AV1).

      These greedy, race-to-the-bottom device makers are still fighting AV1. With people keeping devices longer and not upgrading as much, plus tons of people relying on under-powered smart TVs for watching (forcing streaming services to keep older codecs like h264/h265 around for those customers), I fear it’s going to take a depressingly long time for it to be anything but a web-streaming phenomenon.

  • Zer0_F0x@lemmy.world · 10 points · 2 days ago

    Looking ahead, 53% of AOMedia members surveyed plan to adopt AV2 within 12 months upon its finalization later this year, with 88% expecting to implement it within the next two years.

    From the AOMedia website. So the plan is for it to have AV1 levels of adoption by 2028.

  • Linkerbaan@lemmy.ml · 6 points, 1 downvote · 2 days ago (edited)

    AV1 was mid. Extremely slow encoding and minor performance gains over H265. And no good encoders on release.

    H266 was miles ahead, but that is proprietary like 265. So, win some, lose some.

    • Ferk@lemmy.ml · 5 points · 2 days ago (edited)

      Compression efficiency and speed are often a trade-off. H266 is also much slower than AV1 under the same conditions. Hopefully more AV1 hardware encoders will come along to speed things up… but at least AV1 decoders are already relatively common.

      Also, the gap between h265 and AV1 is bigger than the gap between AV1 and h266, so I’d argue it’s the other way around. AV1 is reported to be capable of ~30-50% bitrate savings over h.265, at the cost of speed. H266’s differences from AV1 are minor: it’s reported to reach a similar range, just weighted more towards the 50% side, and at the cost of even lower speed. I’d say once AV1 encoding hardware is more common and the slower, higher-quality AV1 presets become viable, it’d be a good balance for most cases.

      The thing is that h26x has a consortium of corporations behind it, with connections and an interest in ensuring they can cash in on their investment, so they get a lot of traction in getting hardware out.
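
      To make that speed trade-off concrete, here’s a rough sketch using ffmpeg’s SVT-AV1 encoder (filenames and the CRF value are placeholders; lower preset numbers are slower but compress better):

          # fast preset: quick encode, less efficient compression at the same CRF
          ffmpeg -i source.mkv -c:v libsvtav1 -preset 8 -crf 32 -c:a copy av1_fast.mkv
          # slow preset: far more CPU time, noticeably better compression efficiency
          ffmpeg -i source.mkv -c:v libsvtav1 -preset 2 -crf 32 -c:a copy av1_slow.mkv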

      • WolfLink@sh.itjust.works · 1 point · 1 day ago

        AV1 only has gains at very low quality settings. For high quality, h265 is much better. At least with the codecs available in ffmpeg, from my tests.

        • Ferk@lemmy.ml · 1 point · 1 day ago (edited)

          Note that a high-quality + low-bitrate AV1 setup often requires parameters that raise the encoding time and processing power beyond what’s typically sensible on an average setup without a hardware encoder. And compared with h265 the cost is even higher, since not only is h265 less complex and faster to begin with, it’s also often hardware-accelerated.

          Here’s a 2020 paper comparing various encoders for high-quality encoding on full HD: https://www.researchgate.net/publication/340351958_MSU_Video_Codec_Comparison_2019_part_IV_High-Quality_Encoding_aom_rav1e_SVT-AV1_SVT-HEVC_SVT-VP9_x264_x265_ENTERPRISE_VERSION

          “First place in the quality competition goes to aom [AOMedia’s AV1 encoder], second place goes to SVT-AV1, and third place to x265”

          And the AV1 encoders are younger, so I wouldn’t be surprised if they have improved more than the h265 ones since that article.

          Here are the settings they used with aom, for reference:

          aomenc.exe --width=%WIDTH% --height=%HEIGHT% ^
              --fps=%FPS_NUM%/%FPS_DENOM% --bit-depth=8 --end-usage=vbr ^
              --cpu-used=0 --target-bitrate=%BITRATE_KBPS% --ivf --threads=32 ^
              --tune=ssim -o %TARGET_FILE% %SOURCE_FILE%
          
          • WolfLink@sh.itjust.works · 1 point · 15 hours ago

            I can try it again, but what I did was compare h265 to SVT-AV1 in ffmpeg, using a couple of different clips of different styles (including a video from my phone and some ripped Blu-ray movies). I used “constant quality / variable bitrate” settings, and ran each file with a variety of settings for both encoders. I judged the videos with a quality comparison tool ffmpeg has, and I also took subjective notes when I could tell the difference.

            I found AV1 did better at very low quality (once it was firmly in the region where the output was visibly different from the source, AV1 did have better quality per bitrate).

            But when trying to produce high-quality clips, AV1 was never able to produce a clip that matched the quality score of h265, even when the bitrate of the AV1 file was higher.
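
            For anyone wanting to reproduce this kind of test, a minimal sketch (placeholder filenames and CRF values, not my exact settings; the scoring step assumes an ffmpeg build with libvmaf enabled):

                # encode the same source with both encoders
                ffmpeg -i clip.mkv -c:v libsvtav1 -crf 30 -an av1_test.mkv
                ffmpeg -i clip.mkv -c:v libx265 -crf 23 -an h265_test.mkv
                # score each encode against the original (higher VMAF = closer to the source)
                ffmpeg -i av1_test.mkv -i clip.mkv -lavfi libvmaf -f null -
                ffmpeg -i h265_test.mkv -i clip.mkv -lavfi libvmaf -f null -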

  • utopiah@lemmy.ml · 5 points · 2 days ago

    So… a lot more people now have:

    • 4G/5G on the go, proper broadband at home and at the office, and even other ways to get data in unusual locations (sadly via MuskSat for now…)
    • very capable devices: mobile phones, (mostly Android) clients such as video projectors or dongles, and of course computers
    • human eyes… that can’t really appreciate 4K on average

    … so obviously we should NOT stop looking for more efficient ways and new usages, but I’m also betting that we are basically reaching diminishing returns already. I don’t think many people care anymore about much higher screen resolution or refresh rate for typical video streaming. Because that’s the most popular usage, I imagine everything else, e.g. XR, is niche relative to it and thus has a hard time benefiting as much from the growth in performance we’ve had until now.

    TL;DR: OK cool but aren’t we already flattening the curve on the most popular need anyway?

    • MrMcGasion@lemmy.world · 4 points · 2 days ago

      It’s not for the end user at this point, it’s for YouTube/streaming companies to spend less on bandwidth at existing resolutions. Even a 5% decrease in size for similar quality could save millions in bandwidth costs over a year for YouTube or Netflix.
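
      As a back-of-the-envelope sketch (all numbers here are made up for illustration, not from any real provider):

          # hypothetical service: 100 PB of video egress per month at ~$0.01 per GB
          MONTHLY_GB=$((100 * 1000 * 1000))               # 100 PB expressed in GB
          MONTHLY_COST=$((MONTHLY_GB / 100))              # $0.01/GB -> $1,000,000 per month
          YEARLY_SAVINGS=$((MONTHLY_COST * 12 * 5 / 100)) # 5% smaller files for a year
          echo "roughly \$$YEARLY_SAVINGS saved per year" # ~$600,000 even at this modest scale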

      • utopiah@lemmy.ml · 1 point · 2 days ago

        Thanks for the clarification. It makes me wonder, though: is it a bandwidth saving at no cost to the user? I.e., is the compression improved without requiring more compute at the receiving end to decompress?

        • MrMcGasion@lemmy.world · 3 points · 2 days ago

          Without hardware decoding, it will take more compute to decompress, but sites usually wait to fully roll out new codecs until hardware decoding is more ubiquitous, because of how many people use low-powered streaming sticks and Smart TVs.

          • utopiah@lemmy.ml · 2 points · 2 days ago

            Then it’s arguably offloading some of the cost to the end user: large streaming companies spend a bit less on IXP contracts, while viewers have to have newer hardware that might also need a bit more energy to run.

            • Ferk@lemmy.ml · 2 points · 18 hours ago (edited)

              On the upside, the end user needs less data for the same content. This is particularly interesting on 4G/5G and restrictive data plans, or when accessing places / servers with a weak connection. It helps avoid having to wait for the “buffering” of the video content mid-playback.

              But yes, I agree each iteration has diminishing returns, with a higher bump in requirements. I feel that’s a pattern we see often.

    • Ferk@lemmy.ml · 3 points · 2 days ago (edited)

      This! Also there’s AI upscaling: if good enough, it could (in theory) make a 1080p video look like 4K in a way only a very few lucky, healthy young people would be able to tell apart from the real thing. In the meantime, my eyesight progressively gets worse with age.