"Set for a year-end release, AV2 is not only an upgrade to the widely adopted AV1 but also a foundational piece of AOMedia’s future tech stack.
AV2, a generation leap in open video coding and the answer to the world’s growing streaming demands, delivers significantly better compression performance than AV1. AV2 provides enhanced support for AR/VR applications, split-screen delivery of multiple programs, improved handling of screen content, and an ability to operate over a wider visual quality range. AV2 marks a milestone on the path to an open, innovative future of media experiences."
Another codec that will take a decade to get widespread adoption and hardware compatibility. /sigh
And which will be so resource-intensive to encode with, compared to existing standards, that it’ll probably take 14 years before home media collectors (or yar har types) are able and willing to use it over HEVC and AV1. :\
As an example, AV1 encodes are to this day extremely rare in the p2p scene. Most groups still work with h264 or h265, even those focusing specifically on reducing sizes while maintaining quality. By contrast, HEVC had significant uptake within 3-4 years of its release in the p2p scene (we’re on year 7 for AV1).
These greedy, race-to-the-bottom device makers are still fighting AV1. With people keeping devices longer and not upgrading as much, plus tons of people relying on under-powered smart TVs for watching (forcing streaming services to keep older codecs like h264/h265 around for those customers), it’s going to take a depressingly long time for this to be anything but a web streaming phenomenon, I fear.
To be fair, it’s also basically impossible to get extremely high quality AV1 video, which is what a lot of P2P groups strive for. A lot of effort has gone into trying, and the results weren’t good enough compared to x264, so it’s been ignored. AV1 is great at compression efficiency, but it can’t make fully transparent encodes (i.e., indistinguishable from the source). It might be different with AV2, though even if it’s possible it may still be ignored because of compatibility; groups still use DTS-HD MA over the objectively superior FLAC codec for surround sound to this day because of hardware compatibility. (For 1.0/2.0 channels they usually use FLAC, because players support that.)
As for HEVC/x265, it too is not as good as x264 at very high quality encoding, so it’s also ignored when possible. Basically the breakdown is: 4K encoding uses x265 in order to store HDR, and because x265’s big-block efficiency is good enough to compress further than the source material. x264 wouldn’t be used for 4K encoding even if it could store HDR, because its compression efficiency is so bad at higher resolutions that any decent-quality encode would end up bigger than the source material. Many people don’t even bother with 4K x265 encodes and just collect the full discs/remuxes instead, because they dislike x265’s encoder quality and don’t deem the size efficiency worth its picture quality impact (pretty picky people here, and I’m not really in that camp).
For 1080p, x265 is only used when you want HDR in a 1080p package, because again x265’s picture quality can’t match x264, but most people deem HDR the bigger advantage. x264 is still the tool of choice for non-HDR 1080p encodes, and that’s not a culture thing, that’s just a quality thing. Once you get down into public P2P or random encoding groups it’s anything goes, and x265 1080p encodes get a lot more common because x265’s efficiency is pretty great compared to x264. But the very top-end quality just can’t match x264 in the hands of an experienced encoder, so the groups chasing that top-end quality only use x265 when they have to.
Edit: All that to say, we can’t entirely blame old-head culture or hardware compatibility for the unpopularity of newer formats. I think the home media collector use case is actually a complete outlier in terms of what these formats are actually being developed for. WEB-DL content favors HEVC and AV1 because it’s very efficient and displays a “good enough” quality picture for their viewers. Physical Blu-rays don’t have to worry about HDD space or bandwidth and just pump the bitrate insanely high on HEVC so that the picture quality looks great. For the record, VVC/x266 is already on the shortlist for being junk for the use cases described above (x266 is too new to fully judge), so I wouldn’t hold my breath for AV2 either. If you’re okay with non-transparency, I’d just stick with HEVC WEB-DLs or try to find good encoding groups that target a more opinionated quality:size ratio (some do actually use AV1!). Rules of thumb for WEB-DL quality are here, though it will always vary on a title-by-title basis.
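For anyone who hasn’t poked at this themselves, here’s a rough sketch of what that x265-for-HDR / x264-for-SDR split looks like as ffmpeg calls (wrapped in Python only so it’s copy-pasteable as a script). File names, CRFs, and the HDR signalling flags are illustrative guesses, not any group’s actual recipe, and a real HDR encode would also carry the mastering-display/CLL metadata over from the source:

```python
# Illustrative only: one x265 pass that keeps HDR10 signalling, one x264 pass for SDR 1080p.
import subprocess

HDR_SOURCE = "source_2160p_hdr.mkv"   # hypothetical 4K HDR source
SDR_SOURCE = "source_1080p_sdr.mkv"   # hypothetical SDR source

# 4K HDR10: 10-bit x265 with BT.2020 / PQ signalling so the HDR survives the encode.
hdr_2160p = [
    "ffmpeg", "-i", HDR_SOURCE,
    "-c:v", "libx265", "-preset", "slow", "-crf", "16",
    "-pix_fmt", "yuv420p10le",
    "-color_primaries", "bt2020", "-color_trc", "smpte2084", "-colorspace", "bt2020nc",
    "-x265-params", "hdr10-opt=1",  # mastering-display / max-cll values would be copied from the source
    "-c:a", "copy",
    "out_2160p_hdr_x265.mkv",
]

# Non-HDR 1080p: x264 at a quality-oriented CRF, which is where x264 still holds up best.
sdr_1080p = [
    "ffmpeg", "-i", SDR_SOURCE,
    "-c:v", "libx264", "-preset", "veryslow", "-crf", "16", "-tune", "film",
    "-c:a", "copy",
    "out_1080p_sdr_x264.mkv",
]

for cmd in (hdr_2160p, sdr_1080p):
    subprocess.run(cmd, check=True)
```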
I haven’t seen h264 used except for BD remuxes, because the Blu-ray standard uses h264.
I can’t wait to possibly buy a card a decade from now with AV2 hardware decoding. Ain’t happening until after 2035, that’s for sure.
If we’re lucky, some firmware upgrade or driver update could make cards with AV1 hardware decoding able to do AV2 as well, but I seriously doubt it - GPU manufacturers want to sell new cards all the time, after all.

We will probably have decoding in 5 years. Encoding may take longer.
The main thing I want is small file size for the quality. Netflix, YouTube, and I agree on that.
Most of my stuff is AV1 today even though the two TVs I typically watch it on do not support it. Most of the time, what I am watching is high-bitrate H.264 that was transcoded from the low-bitrate AV1.
I will probably move to AV2 shortly after it is available. At least, I will be an early adopter. The smaller the files the better. And, in the future when quality has gone up everywhere, my originals will play native and look great.
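In case it helps, the playback transcode step is basically just this: the small AV1 archive file gets blown back up into fat H.264 that the TVs can decode (a media server does roughly the same thing on the fly). A minimal sketch with placeholder paths and settings, wrapped in Python around ffmpeg:

```python
# Sketch: low-bitrate AV1 archive -> high-bitrate H.264 for playback on TVs without AV1 support.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "archive_av1.mkv",
    "-c:v", "libx264", "-preset", "fast", "-crf", "16",  # generous quality target so the second lossy step costs little
    "-c:a", "aac", "-b:a", "256k",
    "playback_h264.mp4",
], check=True)
```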
I want to agree with you, and I do to a large extent. I like new codecs, and having a more open-source codec is better than using a codec that has many patents. Long-term patents (the current situation) slow technological progress.

What I don’t agree with you on are some of the details.
First, Netflix, YouTube and so on need low bitrate, and they (especially Google/YouTube) don’t care that much about quality. Google/YouTube videos are really bit-starved for their resolutions; Netflix is a bit better.

Second, when many people discuss codecs they are referring to a different use case: archiving. As in, the best quality codec at the same size. So they compare the original (raw video, no lossy codec used) with the encoded ones. Their conclusion is that AV1 is great for size reduction, but can’t beat h264 for fidelity at any size. I think h264 has a placebo or transparent profile but AV1 doesn’t.
So when I download a fi…I mean a Linux ISO from torrents, I usually go for the newest codec. But recently I don’t go for the smallest size, because it takes away from the details in the picture.

But if I want to archive a movie (one that I like a lot, which is rare), I get the bigger h264 (or, for UHD Blu-rays, h265).
Third: a lot of people’s idea of codec quality is formed by downloading or streaming other people’s encoded videos, and they don’t compare the quality themselves (as they don’t have the time or a good raw video to compare against).

Fourth: I have heard AV1 has issues with film grain, as in it removes it. Film grain is an artifact of physical film (non-digital) that unfortunately many directors try (or used to try) to duplicate, because they grew up watching movies on film and think that’s how movies should look, so they add it back in post-production. It is literally a defect, and even the human eye doesn’t reproduce it, so it’s not even natural. But this is still a bug in AV1 (if I read correctly), because a codec should go for high fidelity, not high smoothness.
AV1 has issues with film grain. There are things you can do. Let me admit, however, that one movie I have not encoded as AV1 is a restored version of the original Star Wars, and film grain is a contributor to that.
Another thing about film grain, though, is that it is often artificially added afterwards, as you say. With AV1, you can often get amazing compression that removes the grain as a side effect and then just add it back yourself. To each their own how they feel about this approach.
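If anyone wants to play with that approach, AV1’s film grain synthesis is exposed through SVT-AV1 in ffmpeg. A rough sketch (the grain strength is something you’d tune per title, and every value here is a placeholder):

```python
# Sketch: denoise the grain away while encoding and store a grain model instead,
# which the decoder re-synthesizes on playback (SVT-AV1 film-grain / film-grain-denoise).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "grainy_source.mkv",
    "-c:v", "libsvtav1", "-preset", "4", "-crf", "24",
    "-svtav1-params", "film-grain=12:film-grain-denoise=1",  # strength is 0-50; 12 is an arbitrary starting point
    "-c:a", "copy",
    "out_av1_grain_synth.mkv",
], check=True)
```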
I also agree that H.264 can be more transparent. However, that is at massive file sizes. Others may have the space for that but I do not… Perhaps I do not have the eyes for it either. I am not extracting and comparing single frames. To me, the AV1 files that I have look better at the size that I am archiving than they would using any other codec.
I use the fact that massive-bitrate H.264 looks great to my advantage, as that is what my AV1 is being transcoded into when I watch it most of the time.
Some content compresses better than others. Sometimes I get massive size reductions with AV1 at what looks like great quality to me. Other times, it struggles to beat H.265 or even H.264 at similar quality. It is pretty rare that I do not choose AV1 though.
I often use Netflix VMAF to get an idea of target compression. It is not perfect though. You have to verify visually. Saves time trialing different parameters though.
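For reference, that VMAF check can be scripted straight through ffmpeg’s libvmaf filter (needs an ffmpeg build with libvmaf; the file names here are placeholders):

```python
# Sketch: score an encode against its source with libvmaf. First input = distorted, second = reference.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "candidate_av1.mkv",      # the encode being judged
    "-i", "reference_source.mkv",   # the source it came from
    "-lavfi", "libvmaf=log_path=vmaf.json:log_fmt=json",
    "-f", "null", "-",
], check=True)
# The pooled score lands in vmaf.json (and in ffmpeg's console output); as said above,
# still verify visually, since VMAF can miss things like grain loss or banding.
```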
I should say that the audio codec is another big factor. I typically pair AV1 with Opus audio and the size reductions there are amazing even at quality levels that are transparent to me.
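The whole pairing fits in one ffmpeg call; a sketch with placeholder numbers (somewhere around 128-192 kbps Opus is where a lot of people stop hearing a difference for stereo, but tune by ear):

```python
# Sketch: AV1 video (SVT-AV1) plus Opus audio in one pass; CRF and audio bitrate are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "source.mkv",
    "-c:v", "libsvtav1", "-preset", "5", "-crf", "28",
    "-c:a", "libopus", "-b:a", "160k",
    "out_av1_opus.mkv",
], check=True)
```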
If AV2 offers better quality at the same size, or similar quality at smaller sizes, I will likely switch to it long before having hardware that can play it natively.
All these naysayers in the comments here… It’s obvious you have to keep the development pipeline moving. Just because we have one free codec at the hardware-support stage now does not mean the world can stop. There are always multiple codecs out there at various stages of adoption; that’s just normal.
Looking ahead, 53% of AOMedia members surveyed plan to adopt AV2 within 12 months upon its finalization later this year, with 88% expecting to implement it within the next two years.
From the AOMedia website. So the plan is for it to have AV1 levels of adoption by 2028.
We only just got hardware support for AV1 and they are already coming out with a new codec?
Very cool! I’ve only just recently gotten to experience the joys of AV1 for my own game recordings (Linux is way ahead of Windows here), and dang is it nice. 10 minute flashback recordings of 4K HDR@60 for only 2.5GB, and the results look fantastic. Can just drag and drop it over to YouTube as well, it’s fully supported over there.
Glad to see things moving, I’ll be eager to check this out in a few years once it has wider support!
AV1 was mid. Extremely slow encoding and minor performance gains over H265. And no good encoders on release.
H266 was miles ahead, but that is proprietary like 265. So win some, lose some.
Compression efficiency and encoding speed are often a trade-off. H266 is also much slower than AV1 under the same conditions. Hopefully more AV1 hardware encoders will come along to speed things up… but at least AV1 decoders are already relatively common.
Also, the gap between h265 and AV1 is bigger than the gap between AV1 and h266, so I’d argue it’s the other way around. AV1 is reported to be capable of ~30-50% bitrate savings over h.265 at the cost of speed. H266’s differences with AV1 are minor; it’s reported to hit a similar range, just more balanced towards the 50% side and at the cost of even lower speed. I’d say once AV1 encoding hardware is more common and the higher presets for AV1 become viable, it’d be a good balance for most cases.
The thing is that h26x has a consortium of corporations behind it, with connections and an interest in making sure they can cash in on their investment, so they get a lot of traction to get hardware out.
AV1 only has gains at very low quality settings. For high quality, h265 is much better. At least with the codecs available in ffmpeg, from my tests.
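If you want to reproduce that kind of test, the rough shape is: run the same clip through the stock ffmpeg AV1 and HEVC encoders at a few quality targets, then compare file sizes and a metric (plus your own eyes). A sketch with placeholder CRFs (they are not equivalent across codecs, so sweep them):

```python
# Sketch: same clip through SVT-AV1 and x265, then compare file size plus SSIM against the source.
import os
import subprocess

SRC = "test_clip.mkv"  # placeholder source
runs = {
    "av1":  ["-c:v", "libsvtav1", "-preset", "4", "-crf", "20"],
    "hevc": ["-c:v", "libx265", "-preset", "slow", "-crf", "18"],
}

for name, vopts in runs.items():
    out = f"out_{name}.mkv"
    subprocess.run(["ffmpeg", "-y", "-i", SRC, *vopts, "-an", out], check=True)
    print(name, os.path.getsize(out), "bytes")
    # SSIM (printed in ffmpeg's log): first input is the encode, second is the reference.
    subprocess.run(["ffmpeg", "-i", out, "-i", SRC, "-lavfi", "ssim", "-f", "null", "-"], check=True)
```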