> Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.
I don't think that's true. I bought a ThinkPad laptop, installed Linux, and one of my issues was that watching a YouTube video put the CPU at 60%+ load. The same video on a MacBook barely scratched the CPU at all. I finally solved this by installing Arch; once everything was set up properly, CPU load was around 10% for the same video. I didn't try Windows, but I'd expect things to work well there too.
So for the average user, most video is probably hardware decoded.
>>> It can generally handle most decode operations with CPU help and a very narrow encoding spec.
This is spot on. Video coding specs are like a "huge bunch of tools", and encoders get to choose whatever subset of tools suits them. And then the hardware gets frozen for a generation.
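The "frozen for a generation" point can be sketched as a capability table: a hardware decoder ships with a fixed set of supported (codec, profile, level) combinations, and any bitstream outside that set falls back to software. A minimal sketch in Python; the capability table and profile names below are illustrative, not any real chip's datasheet:

```python
# Illustrative sketch: hardware decode support as a frozen capability table.
# The entries here are hypothetical, not taken from any real GPU's spec.

HW_CAPS = {
    ("h264", "High"): 5.1,   # High profile supported up to level 5.1
    ("h264", "Main"): 5.1,
    ("hevc", "Main"): 5.1,
    # no ("h264", "High 4:4:4") entry -> exotic profiles go to software
}

def decode_path(codec: str, profile: str, level: float) -> str:
    """Return 'hardware' if this (codec, profile) pair is supported
    at the requested level, else 'software'."""
    max_level = HW_CAPS.get((codec, profile))
    if max_level is not None and level <= max_level:
        return "hardware"
    return "software"

print(decode_path("h264", "High", 4.1))        # a typical streaming profile
print(decode_path("h264", "High 4:4:4", 4.1))  # exotic profile, not in the table
print(decode_path("hevc", "Main", 6.0))        # level exceeds the frozen cap
```

The point of the sketch: the table is baked into silicon, so an encoder that picks a tool combination outside it loses hardware decode for that whole hardware generation.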
>Most of the video you encode on a computer is actually all in software/CPU because the quality and efficiency is better.
That was the case up to like 5 to 10 years ago.
These days it's all hardware encoded and hardware decoded, not least because Joe Twitchtube Streamer can't and doesn't give a flying fuck about pulling 12 dozen levers to encode a bitstream thrice for the perfect encode that'll get shat on anyway by Joe Twitchtok Viewer, who doesn't give a flying fuck about pulling 12 dozen levers and applying a dozen filters to get the perfect decode.
It’s not all hardware encoded - we do a huge number of transcodes per day, and quality matters for our use case.
Certainly speed and low CPU usage matter for some use cases, but not for all of them.
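For a sense of why software encoding can still win at scale: better compression efficiency trades CPU time for egress/storage. A back-of-the-envelope sketch in Python; every number here is purely hypothetical, chosen only to show the shape of the tradeoff:

```python
# Back-of-the-envelope: when does slower software encoding pay for itself?
# All figures below are hypothetical, for illustration only.

def monthly_egress_cost(bitrate_mbps: float, views: int,
                        hours_per_view: float, dollars_per_gb: float) -> float:
    """Monthly delivery cost for streaming at a given average bitrate."""
    gb_per_view = bitrate_mbps / 8 * 3600 * hours_per_view / 1000
    return gb_per_view * views * dollars_per_gb

# Hypothetical: hardware encode needs ~5 Mbps for the perceived quality
# that a slow software encode achieves at ~4 Mbps.
hw = monthly_egress_cost(5.0, views=1_000_000, hours_per_view=0.5, dollars_per_gb=0.05)
sw = monthly_egress_cost(4.0, views=1_000_000, hours_per_view=0.5, dollars_per_gb=0.05)
savings = hw - sw

print(f"hardware-encode egress: ${hw:,.0f}/mo")
print(f"software-encode egress: ${sw:,.0f}/mo")
print(f"savings:                ${savings:,.0f}/mo")  # budget available for CPU time
```

With numbers like these, the egress saved by the more efficient software encode is what funds the extra CPU hours; for a live streamer encoding one copy for one viewer, the same arithmetic comes out the other way, which is the parent's point.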
Not sure why this is downvoted; virtually all serious Plex use runs on hardware decode on Intel iGPUs, down to an i3. The CPU only gets involved for things like subtitle burn-in or audio transcoding.
Because Plex and gamer streaming are not the only use cases for transcoding.