The last benchmark is interesting:

> 1080p60 HEVC at 3500 bitrate with "Fast" preset - 319 fps

Ok, why were these parameters chosen? What's the application? I recommend everyone look at 1080p60 footage encoded with H.265 using the "fast" encoder preset at a 3500 kbps bitrate. Calling it terrible would be a compliment. Unless you encode really slowly and the motion is visually easy, which raises the question of why you would need 60 fps in the first place. Even at the "medium" preset with 1080p60 you should, regardless of application, be at least in the 5000+ kbps range with your bitrate. And even that comes with a lot of trade-offs, because that's just where live streaming starts.
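
If you want to see it for yourself, here's a minimal sketch, assuming ffmpeg is on your PATH and built with libx265; "input.mp4" is a hypothetical 1080p60 source clip:

    # Sketch: produce the two encodes being compared above so you can
    # eyeball them side by side. Assumes ffmpeg built with libx265 and a
    # hypothetical 1080p60 source clip, "input.mp4".
    import subprocess

    def encode_hevc(src: str, dst: str, preset: str, bitrate_kbps: int) -> None:
        """Encode src to HEVC at the given preset and average bitrate."""
        subprocess.run(
            ["ffmpeg", "-y", "-i", src,
             "-c:v", "libx265",
             "-preset", preset,
             "-b:v", f"{bitrate_kbps}k",
             "-an",  # drop audio; only video quality matters here
             dst],
            check=True,
        )

    # The benchmark's settings: "fast" at 3500 kbps...
    encode_hevc("input.mp4", "fast_3500.mp4", "fast", 3500)
    # ...versus the floor suggested above: "medium" at 5000 kbps.
    encode_hevc("input.mp4", "medium_5000.mp4", "medium", 5000)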




I believe most of these benchmarks were set at a time when you simply couldn't run x265 on “slow” on any reasonable CPU if you ever wanted it to complete. But yes, I'd really like CPU benchmarkers to move to higher-quality presets for video encoding, because they do tend to have different kinds of performance curves.

Fun fact: There's no point in running x265 on the fastest presets unless you absolutely need to have HEVC; x264 on slow is faster _and_ gives better quality per bit. See the second graph on https://blogs.gnome.org/rbultje/2015/09/28/vp9-encodingdecod... (a few years old, the situation is likely to look similar but not identical).
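For anyone who wants to spot-check that claim on their own footage, here's a rough sketch, assuming ffmpeg with libx264, libx265, and the ssim filter available; "source.mp4" is a hypothetical input clip (SSIM is a crude stand-in for a proper visual comparison, but it's enough to see the gap at fast presets):

    # Sketch: x264 "slow" vs. x265 "fast" at the same average bitrate,
    # timing each encode and scoring it with ffmpeg's ssim filter
    # (higher is better). Assumes ffmpeg on PATH with libx264/libx265;
    # "source.mp4" is a hypothetical clip.
    import subprocess
    import time

    def encode(src, dst, codec, preset, bitrate="3500k"):
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-preset", preset,
             "-b:v", bitrate, "-an", dst],
            check=True,
        )

    def ssim_report(distorted, reference):
        # The ssim filter logs a summary line ("SSIM ... All:...") to stderr.
        result = subprocess.run(
            ["ffmpeg", "-i", distorted, "-i", reference,
             "-lavfi", "ssim", "-f", "null", "-"],
            capture_output=True, text=True,
        )
        return [l for l in result.stderr.splitlines() if "SSIM" in l]

    t0 = time.perf_counter()
    encode("source.mp4", "x264_slow.mp4", "libx264", "slow")
    print("x264 slow:", time.perf_counter() - t0, "s")

    t0 = time.perf_counter()
    encode("source.mp4", "x265_fast.mp4", "libx265", "fast")
    print("x265 fast:", time.perf_counter() - t0, "s")

    print(ssim_report("x264_slow.mp4", "source.mp4"))
    print(ssim_report("x265_fast.mp4", "source.mp4"))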


>(a few years old, the situation is likely to look similar but not identical).

x265 has made massive improvements over the years; in 2015, x265 wasn't even considered good despite all of its hype. Another way to think about it is how well x264 managed to squeeze out every last bit of quality possible.


IIRC, I redid this graph in early 2019 (using Tears of Steel), and it looked pretty similar.



