
In addition to what others have said, I remember reading somewhere that CPUs give more reliably accurate results, and that this is part of why they're still preferred for pre-rendered content



> I remember reading somewhere that CPUs give more reliably accurate results

This is no longer true, and hasn't been for around a decade. It's a left-over memory of when GPUs weren't using IEEE 754-compatible floating point. That changed a long time ago, and today all GPUs are fully up to par with the IEEE standard. GPUs even took the lead for a while with the FMA instruction, which computes a*b + c with a single rounding and was more accurate than what CPUs had; Intel and others have since added FMA instructions to their CPUs.
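(For anyone curious about the FMA accuracy point: a minimal C sketch below, with values I picked purely for illustration, shows the single-rounding property. A plain `a * b` rounds the product and loses the low-order bits, while `fma(a, b, -p)` recovers the exact rounding error, the classic "TwoProduct" trick.)

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* Illustrative values chosen so the exact product needs more than
           53 significand bits and so cannot be stored exactly in a double. */
        double a = 1.0 + ldexp(1.0, -29);   /* 1 + 2^-29 */
        double b = 1.0 + ldexp(1.0, -30);   /* 1 + 2^-30 */

        double p   = a * b;                 /* product rounded once to double   */
        double err = fma(a, b, -p);         /* a*b - p with a single rounding:
                                               recovers the exact rounding error */

        printf("naive a*b - p = %g\n", a * b - p);  /* 0: error already lost  */
        printf("fma(a, b, -p) = %g\n", err);        /* ~1.7e-18 (2^-59)       */
        return 0;
    }

(Compile with something like `cc -O2 fma_demo.c -lm`; on hardware without an FMA unit, libm falls back to a slower software implementation.)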


I believe this was historically true: GPUs often "cheated" with floating point math to optimize hardware pipelines for game rasterization, where only looks matter. It probably stopped being true as GPGPU took hold over the last decade.


Ah, that makes sense



