That isn't really an accurate description of the issue: it's not the case that the code was "working" and was then "broken" by GCC maintainers. It has always been unsupported and has never worked in certain situations; it merely appeared to 99% of GCC users that it was supported. Old versions of GCC never guaranteed the signed integer overflow behaviour, and code exploiting it would not always be compiled with the "expected" behaviour. It's just that as the GCC optimizer has improved, there are more circumstances in which it performs optimisations that hinge on that assumption.
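To make that concrete, here's a minimal sketch (the function name is made up, and the exact behaviour depends on GCC version and flags):

    #include <limits.h>
    #include <stdio.h>

    /* Because signed overflow is undefined, GCC at -O2 is entitled to
       fold the comparison x + 1 > x to "always true", compiling this
       whole function down to "return 1" -- even for x == INT_MAX. */
    int still_bigger(int x) {
        return x + 1 > x;
    }

    int main(void) {
        printf("%d\n", still_bigger(INT_MAX));  /* typically prints 1 at -O2 */
        return 0;
    }

Older GCCs that didn't do this fold weren't guaranteeing wraparound; they just hadn't got around to exploiting the assumption yet.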
Compiler users generally want something that "just works" and doesn't do anything unexpected. But in a low-level language like C, doing away with undefined behavior would essentially mean pessimistically giving up many optimisations on the 99%+ of straightforward, reasonable code out there, just to avoid surprising the remaining fraction of dubious code that depends on particular things happening where the C standard leaves the behavior undefined. There are languages that make that choice, but C isn't one of them.
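For a sense of what's being bought, here's a simplified sketch (my own example, not from any particular codebase) of an optimisation that leans on the overflow assumption:

    /* Because signed overflow is undefined, the compiler may assume
       i never wraps past INT_MAX, so the loop provably executes
       exactly n times and i can be widened to a 64-bit induction
       variable on x86-64 with no per-iteration sign extension.
       Make i unsigned and wraparound becomes defined behavior, so
       the compiler has to prove more before doing the same thing. */
    long sum(const int *a, int n) {
        long s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }

Multiply that by every loop in a large codebase and the cost of mandating wraparound everywhere becomes significant.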
That is my point. Code that happens to work in practice is not the same as code that is guaranteed to work.
I have mixed feelings about the integer overflow issue because it's so easy to trigger, unlike contrived triple-post-increment examples. And it usually results in a security problem, for very little benefit, IMO.
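The classic failure mode is an after-the-fact overflow guard (a sketch; the names are invented):

    #include <stdlib.h>

    /* The guard tests for overflow *after* computing size + pad.
       Since signed overflow is undefined, the compiler may assume
       size + pad cannot wrap and delete the check entirely, turning
       a defensive test into a silent security hole. */
    void *alloc_padded(int size, int pad) {
        if (size + pad < size)      /* intended overflow guard      */
            return NULL;            /* may be optimized away at -O2 */
        return malloc((size_t)(size + pad));
    }

The safe version checks before the addition, e.g. `if (pad > INT_MAX - size)` (assuming both arguments are non-negative), which involves no undefined behaviour.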