Hacker News new | past | comments | ask | show | jobs | submit login

It's far more likely that a major compiler will exploit more undefined behavior for optimizations than that a processor with a bizarrely sized byte will become popular. The major compilers already do this.

> The reality is unless you are using a formally-defined language like ML, you are relying on undefined behavior in your language.

This is not true, at least not under the definition of "undefined behavior" given in the C standard. A program can be written to avoid every construct the standard leaves undefined, even though the language as a whole lacks a formal semantics.
