It's far more likely that a major compiler will exploit more undefined behavior for optimizations than that a processor with a bizarrely sized byte will become popular. The major compilers already do this.
> The reality is unless you are using a formally-defined language like ML, you are relying on undefined behavior in your language.
This is not true, at least not under the definition of "undefined behavior" given in the C standard.