
The language could make those guarantees if it wanted to. This might add overhead on some architectures, but it would be possible. An example is integer overflow: if we limit ourselves to machines using two's complement (i.e., essentially every architecture used in the last thirty years), it could be fully defined easily. And if C were ever used on another architecture, implementers could build a workaround using some form of overflow trap (since CPU design would take C into consideration).
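
A minimal sketch of what defined wraparound could look like even today, written as a helper rather than a language change (the route through unsigned arithmetic is already defined to wrap; the cast back to a signed type is implementation-defined rather than undefined, and wraps on mainstream compilers):

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    /* Wrapping signed add with two's-complement semantics. Plain `a + b`
       on signed int is undefined on overflow; unsigned arithmetic is
       defined to wrap modulo 2^32. */
    static int32_t add_wrap(int32_t a, int32_t b) {
        return (int32_t)((uint32_t)a + (uint32_t)b);
    }

    int main(void) {
        /* prints INT32_MIN on common two's-complement platforms */
        printf("%" PRId32 "\n", add_wrap(INT32_MAX, 1));
        return 0;
    }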

Or evaluation order: `i = ++i;` could easily be defined in some way, though that might prevent some niche compiler optimisations.
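
For illustration, a minimal example of the current situation (my own sketch, not taken from the standard). Both plausible orderings give the same answer here, yet the standard still calls it undefined:

    #include <stdio.h>

    int main(void) {
        int i = 1;
        /* Two unsequenced modifications of i: the side effect of ++i and
           the assignment's store. Per C11 6.5p2 this is undefined
           behaviour, even though either ordering would leave i == 2. */
        i = ++i;
        printf("%d\n", i);
        return 0;
    }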

Of course, by C's nature there are limits (C won't be able to detect use-after-free or similar without changing the language notably), but there is room where UB could be reduced, if that were seen as necessary.




> "The language could make those guarantees if it wanted to"

> "Of course by C's nature there are limits"

You seem to acknowledge the fact that most of the undefined behaviors in C are essentially born out of compromise. Those compromises were driven by principles such as "keep the power in the hands of developers", "don't impose unnecessary restrictions", "keep the language clean", "avoid hidden runtime magic". The end results reflect that.

As I've mentioned in my previous comment, there's no "one size fits all", so the language makes it trivial for you to roll out your very own runtime magic (à la Zig/Nim) that suits you best; see the sketch below. Why is that a bad thing?
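
To make that concrete, here is one way such a check could look. This is just a sketch assuming GCC/Clang's __builtin_add_overflow; the name checked_add is purely illustrative:

    #include <stdio.h>
    #include <stdlib.h>

    /* Overflow-checked add that aborts on overflow, similar in spirit to
       Zig's debug-mode checks. __builtin_add_overflow is a GCC/Clang
       extension, not standard C. */
    static int checked_add(int a, int b) {
        int result;
        if (__builtin_add_overflow(a, b, &result)) {
            fprintf(stderr, "integer overflow in checked_add\n");
            abort();
        }
        return result;
    }

    int main(void) {
        printf("%d\n", checked_add(2, 3));   /* prints 5 */
        /* checked_add(INT_MAX, 1) would abort instead of invoking UB */
        return 0;
    }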



