It's in the same ballpark. Both do considerably better than AVC (h264), but many direct comparisons between HEVC (h265) and AV1 compare apples to oranges. Sure, you can get 30% lower bitrate, but only at degraded quality levels or higher decode complexity.
Also note that HEVC had a considerable head start (5 years?), so performant encoders (let alone energy-efficient decoders) took a while to catch up. Recent ffmpeg versions offer a lot of options; you'll find that even a basic comparison is PhD-level difficult ;-)
> Sure you can get 30% lower bitrate, but only at degraded quality levels or higher decode complexity.
Thank you for pointing this out. This thread is a mess of claims at the moment because this simple fact is under-recognized.
There are two accepted ways to compare codecs+settings: either (a) a subjective comparison with the human eye using the same bitrate for both codecs, or (b) an "objective" metrics-based comparison where you match measured quality and compare the ratio of the bitrates.
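Method (b) can be sketched in a few lines. The rate-distortion points below are made up for illustration (real ones would come from actual encodes scored with VMAF or SSIM), and the averaging is a crude stand-in for the BD-rate metric used in codec literature:

```python
# Sketch of comparison method (b): match quality, compare bitrates.
# The (bitrate, quality) points here are synthetic, not real encoder data.

def bitrate_at_quality(points, q):
    """Linearly interpolate bitrate (kbps) at quality level q.
    points: list of (bitrate, quality) tuples, sorted by ascending quality."""
    for (b0, q0), (b1, q1) in zip(points, points[1:]):
        if q0 <= q <= q1:
            t = (q - q0) / (q1 - q0)
            return b0 + t * (b1 - b0)
    raise ValueError("quality outside measured range")

def avg_bitrate_saving(ref, test, qualities):
    """Average % bitrate saving of `test` vs `ref` at matched quality levels
    (a simplified stand-in for the usual BD-rate computation)."""
    ratios = [bitrate_at_quality(test, q) / bitrate_at_quality(ref, q)
              for q in qualities]
    return 100 * (1 - sum(ratios) / len(ratios))

# Hypothetical curves: codec B needs 30% less bitrate at every quality level.
codec_a = [(1000, 70), (2000, 80), (4000, 90), (8000, 96)]
codec_b = [(700, 70), (1400, 80), (2800, 90), (5600, 96)]

print(avg_bitrate_saving(codec_a, codec_b, [75, 85, 92]))  # ~30.0
```

The real BD-rate uses cubic interpolation over log-bitrate, but the principle is the same: the "X% better" headline is an average over whatever quality range you chose to measure.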
If you're looking only at 1080p SDR 8-bit video, even h264 is already commonly used at bitrates that can approach transparency to the source (visually lossless to the human eye) when encoded well. For example, a typical Blu-ray bitrate of ~30 Mbps can achieve transparency when well-encoded for most sources.
The reason measures like "30%" are misleading is that if you try to match h264 performance at these high bitrates, you won't get anything close to 30% improvement (with HEVC over h264, or AV1 over HEVC). It can be negligible in a lot of cases. In other words, the improvement ratio from increasing the complexity of your media codec depends on the target quality of the encodes used in the test.
AV1 achieves significant improvements ("30%") over HEVC only at the lowest qualities, think YouTube or Twitch streaming. At high bitrates, e.g. something acceptable for watching a movie, the improvement can be much less or even insignificant, and at near-transparency a lot of AV1 encoders actually seem to introduce artifacts that are hard to eliminate. AV1 seems heavily optimized for the typical streaming range of bitrates, and claims about its supposed improvement over HEVC need to be understood in that context.
Depends on the encoder; this website provides easy-to-visualize data sets for various encoders at various settings:
https://arewecompressedyet.com/
AV1 encoders tend to have a better VMAF score at a given bits-per-pixel.
Wow that was unexpected. I checked online and it does say production encoders are faster and the result is somewhat smaller (for same quality). What a time to be alive.