SVT VS AOM AV1

Is there a consensus on what the best AV1 encoder is? SVT had a proper release relatively recently and shows promise in scalability, but it still seems pretty limited in its capabilities. Threading is also moot when dealing with long videos that benefit from scene splitting, as Av1an can easily saturate all CPU cores, even with the absolute slowest settings in aomenc.

Attached: av1-logo-cmyk.jpg (1024x568, 44.46K)
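
For anyone curious what that looks like in practice, here's a rough Python sketch of the trick Av1an pulls: split the video into scene chunks, then run one single-threaded aomenc per chunk in parallel. This assumes the chunks already exist as chunk_*.y4m and aomenc is on PATH; the quality flags are just illustrative.

import glob
import subprocess
from concurrent.futures import ProcessPoolExecutor

def encode_chunk(chunk):
    out = chunk.replace(".y4m", ".ivf")
    # --threads=1 per process: the parallelism comes from running
    # many chunks at once, not from threading inside aomenc
    subprocess.run(["aomenc", "--cpu-used=0", "--end-usage=q",
                    "--cq-level=30", "--threads=1", "-o", out, chunk],
                   check=True)
    return out

if __name__ == "__main__":
    chunks = sorted(glob.glob("chunk_*.y4m"))
    with ProcessPoolExecutor() as pool:  # one worker per CPU core by default
        encoded = list(pool.map(encode_chunk, chunks))
    # concatenate the encoded chunks in order afterwards (e.g. with mkvmerge)
    print(encoded)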

because aom is the reference encoder, i guess it will never get useful encoding speeds

AV1 is a rushed effort by big tech streaming companies to get a barely passable royalty-free codec out the door ASAP
stick to HEVC

but if you want to be serious, I've seen people mostly use rav1e and svt-av1, with or without av1an

HEVC lost.

>Only comes out on top when encoding with Youtube quality
Oh no no no no.

Attached: svt.png (938x625, 32.98K)

LOLNO. While the slowest AV1 encoding is ~30% more efficient than RTX NVENC HEVC encoding, the latter encodes 1080p video at 100+ FPS on a laptop. In the end HEVC will still have far wider hardware decoding support than AV1 for the next decade (i.e. phones/tablets), so it's not going anywhere for consumers.

Attached: j1Of2Ou (1).jpg (2667x1500, 385.94K)
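
If you want to sanity-check that 100+ FPS claim on your own hardware, timing an NVENC HEVC encode with ffmpeg takes a minute. A quick sketch, assuming an ffmpeg build with hevc_nvenc and a 1080p test clip (preset and bitrate are arbitrary):

import re
import subprocess

result = subprocess.run(
    ["ffmpeg", "-y", "-i", "test_1080p.mkv",
     "-c:v", "hevc_nvenc", "-preset", "slow", "-b:v", "8M",
     "-f", "null", "-"],
    capture_output=True, text=True)
# ffmpeg reports progress on stderr as "frame= 3000 fps=142 ..."
fps = re.findall(r"fps=\s*([\d.]+)", result.stderr)
print("final average fps:", fps[-1] if fps else "not found")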

>NVENC

Attached: team america.gif (216x198, 870.06K)

Bro, Intel's hardware AV1 beats Turing NVENC, though. I mean you were right, up until Intel's hardware AV1 encoders on their new GPUs got tested.
And the Pixel 6/Pro/A has hardware AV1 decoding, with the next generation of Snapdragons also getting it.
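
For what it's worth, Intel's hardware AV1 encoder is exposed in ffmpeg as av1_qsv. A minimal sketch, assuming an Arc GPU and an ffmpeg built with QSV support (bitrate illustrative):

import subprocess

# av1_qsv is Intel's hardware AV1 encoder (needs Arc + a recent ffmpeg)
subprocess.run(["ffmpeg", "-y", "-i", "in.mkv",
                "-c:v", "av1_qsv", "-b:v", "6M", "out.mkv"],
               check=True)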

Not to mention that hardware support is only one side of the story. Even my ancient phone can play AV1.

I wouldn't want to destroy my phone's battery life just for a little bitrate savings, but sure. I think the PS4 Pro defaults to AV1 for Netflix, even though it obviously doesn't have hardware decoding; fuck your power bill, Netflix needs more money, right?

It doesn't need hardware DRM for Netflix?

No it doesn't, SVT mode 8 does (barely). Intel's AV1 encoder is on par with the slowest x264 preset, but RTX NVENC outperforms it. In fact, software x265 is only around 10-20% more efficient on the slowest preset USING 2-pass encoding, which is fucking nuts for a GPU video encoder when you think about it. RTX NVENC has effectively made software video encoding nearly obsolete.
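
Note that "2-pass" on NVENC means the encoder's built-in multipass mode, not two separate invocations like x264. Roughly like this via ffmpeg, assuming a recent build (check ffmpeg -h encoder=hevc_nvenc for the exact option names):

import subprocess

# fullres multipass: a full-resolution analysis pass feeds rate control
subprocess.run(["ffmpeg", "-y", "-i", "in.mkv",
                "-c:v", "hevc_nvenc", "-rc", "vbr",
                "-multipass", "fullres", "-b:v", "6M", "out.mkv"],
               check=True)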

YOU LITERALLY CAN'T PLAY 4K30FPS AV1 ON 90% OF PHONES EVEN ON SW.

NVENC became AMAZING when the RTX series launched. NVENC now encodes B-frames, look it up.

Least obvious MPEG shill

>"just watch 240p AV1 video streams on your phone bro, who cares that it looks like it came out of a dogs asshole, at least you're sticking it to the man, hh-ha-ha!"

Oh shit, you're right, or might be. I can't find a single fucking benchmark of actual H.265 NVENC vs Intel's AV1, and the one I thought was it said NVENC but in fine print clarified it's 264.
But for online streaming, it IS only AV1 vs 264, because no website will ever use 265, so that's moot. But for other uses and personal gamestreaming, maybe Turing NVENC 265 is still king.

>Watching 4K on a tiny phone screen

>But for other uses
Might as well look towards VVC for torrent shit now. It's not like HEVC was widely supported in software anyway.

Yeah. Although I hope HEVC 10-bit doesn't later turn into AVC 10-bit, where later hardware skips decode support because nobody used it. I mean, what legit use is there for HEVC on Android SoCs? Do people plug Blu-ray players into their phones? No... So they might say "fuck pirates" and drop hardware decoding. AVC 10-bit anime from Nyaa can't be played on almost anything in software at full speed, at least not on mobile devices. You have to go re-encode it all as HEVC 10-bit.
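
The usual rescue for a Hi10P library is a straight ffmpeg re-encode to HEVC 10-bit so the phone's hardware decoder can take over. A sketch; the CRF and preset are just illustrative:

import subprocess

# AVC 10-bit (Hi10P) in, HEVC 10-bit out; audio and subs pass through untouched
subprocess.run(["ffmpeg", "-i", "hi10p_episode.mkv",
                "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
                "-crf", "20", "-preset", "slow",
                "-c:a", "copy", "-c:s", "copy",
                "hevc10_episode.mkv"],
               check=True)
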
Also, if we're talking local gamestreaming or wireless VR, for low latency I don't know if HEVC uses B-frames, so maybe AV1 hardware encoding does beat Turing; you sure as fuck aren't doing 2-pass for realtime either.
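
For the realtime case you'd kill B-frames and output buffering outright; a hedged low-latency NVENC example (option names assume ffmpeg 4.3+, verify with ffmpeg -h encoder=hevc_nvenc):

import subprocess

# -tune ull = ultra low latency, -bf 0 drops B-frames,
# -delay 0 stops the encoder from buffering output frames
subprocess.run(["ffmpeg", "-y", "-i", "capture.mkv",
                "-c:v", "hevc_nvenc", "-tune", "ull",
                "-bf", "0", "-delay", "0",
                "-rc", "cbr", "-b:v", "20M", "out.mkv"],
               check=True)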

What's the VMAF score for the 30 series?

I think they all use Turing's NVENC, so probably the same. Also I don't believe any of their ARM64 SBCs have actual NVENC, which sucks.
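
You can always measure it yourself instead of waiting for benchmarks: ffmpeg's libvmaf filter scores an encode against its source (distorted input first, reference second; assumes a build with --enable-libvmaf):

import subprocess

result = subprocess.run(
    ["ffmpeg", "-i", "encode.mkv", "-i", "source.mkv",
     "-lavfi", "libvmaf", "-f", "null", "-"],
    capture_output=True, text=True)
# the aggregate score is printed to stderr, e.g. "VMAF score: 95.43"
for line in result.stderr.splitlines():
    if "VMAF score" in line:
        print(line)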

In that case it's still better to use x264 veryslow. NVENC still misses quite often.

Attached: bench.png (1196x671, 174.47K)
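
For reference, the x264 veryslow route being recommended is just this (the CRF value is illustrative; lower means higher quality):

import subprocess

# software x264 at its slowest practical preset
subprocess.run(["ffmpeg", "-i", "in.mkv",
                "-c:v", "libx264", "-preset", "veryslow", "-crf", "18",
                "-c:a", "copy", "out.mkv"],
               check=True)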

But you see, this is just talking about using NVENC for streaming; it leaves out what makes Turing NVENC amazing, and that is 10-bit H265 2-pass with B-frames. For Twitch or whatever, H265 was never an option, so it's software x264 vs Intel's AV1, and Nvidia's NVENC 264 can never compete with those two, although it would free up your CPU vs x264.
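
Concretely, that "10-bit H265 2-pass with B-frames" combo maps to something like the line below. Flag names and hardware support are assumptions on my part (HEVC B-frames need Turing or newer); verify against ffmpeg -h encoder=hevc_nvenc:

import subprocess

# main10 profile + p010le input gives 10-bit; -bf/-b_ref_mode assume Turing+;
# fullres multipass is the "2-pass" part
subprocess.run(["ffmpeg", "-y", "-i", "in.mkv",
                "-c:v", "hevc_nvenc", "-profile:v", "main10",
                "-pix_fmt", "p010le", "-bf", "4", "-b_ref_mode", "middle",
                "-multipass", "fullres", "-rc", "vbr", "-b:v", "6M",
                "out.mkv"],
               check=True)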

>H.264
shitty bait