
Because it requires some kind of fat binary.

Some C and C++ compilers offer this, but it needs some infrastructure to make it happen (e.g. function multi-versioning via GCC's target_clones or simd attributes), or else explicitly loading different builds of a dynamic library at runtime.
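
A rough sketch of the GCC-style approach (target_clones is a real GCC/Clang attribute for function multi-versioning; the dot-product function itself is just an illustrative placeholder):

    /* The compiler emits one clone per listed ISA plus a default,
       and an ifunc resolver picks the best one at load time. */
    __attribute__((target_clones("avx2", "sse4.1", "default")))
    float dot(const float *a, const float *b, int n) {
        float sum = 0.0f;
        for (int i = 0; i < n; i++)
            sum += a[i] * b[i];   /* auto-vectorized per clone */
        return sum;
    }

This is the "fat binary" cost in miniature: every listed variant ends up in the object file.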




I've found most precompiled C++ libraries target a specific minimum microarchitecture, because fat binaries swell the binary size and most users wouldn't notice the difference anyway. Companies always get complaints when they bump up the minimum, though - I remember reading articles about games requiring SSE4.1 or AVX and gamers complaining they couldn't play them on their ten-year-old machines.


Supporting those ten-year-old machines is the main reason OpenGL 3.3 remains the baseline when targeting GL.

Although maybe GL 4.1 would do it nowadays.


Rust already has runtime CPU feature detection, but using it requires hand-writing the SIMD code.
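
A minimal sketch of what that looks like with std's is_x86_feature_detected! macro and the target_feature attribute (the AVX2 path here is a placeholder; in practice it would contain hand-written core::arch intrinsics):

    #[cfg(target_arch = "x86_64")]
    fn sum(v: &[f32]) -> f32 {
        if is_x86_feature_detected!("avx2") {
            // Only safe to call after the runtime check above.
            unsafe { sum_avx2(v) }
        } else {
            v.iter().sum()
        }
    }

    #[cfg(target_arch = "x86_64")]
    #[target_feature(enable = "avx2")]
    unsafe fn sum_avx2(v: &[f32]) -> f32 {
        // Placeholder body: a real implementation would use
        // core::arch::x86_64 AVX2 intrinsics here.
        v.iter().sum()
    }

The detection is cheap and cached, but the compiler won't generate the per-feature variants for you - hence "hand writing the SIMD code".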




