Eh? I thought that would only be "legal" if it was specified to be implementation-defined behavior. Which would, frankly, be perfectly good. But since it is specified as undefined behavior, programmers are forbidden to use it, and compilers assume it doesn't happen/doesn't exist.
The entire notion that "since this is undefined behavior it does not exist" is the biggest fallacy in modern compilers.
The rule is: If you want your program to conform to the C Standard, then (among other things) your program must not cause any case of undefined behavior. Thus, if you can arrange so that instances of UB will not occur, it doesn't matter that identical code under different circumstances could fail to conform. The safest thing is to make sure that UB cannot be triggered under any circumstances; that is, defensive programming.
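To make "defensive programming" concrete, here is a minimal sketch (the name checked_add is mine, not anything from the thread): the operands are checked first, so the signed addition can never overflow and therefore never invokes UB, no matter how any particular compiler chooses to treat overflow.

    #include <limits.h>
    #include <stdbool.h>

    /* Refuse to add rather than let signed overflow (UB) happen. */
    bool checked_add(int a, int b, int *result)
    {
        if ((b > 0 && a > INT_MAX - b) ||
            (b < 0 && a < INT_MIN - b))
            return false;      /* would overflow; report failure instead */
        *result = a + b;       /* provably in range, so no UB */
        return true;
    }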
Where does that myth come from!? According to the authors of C89 and C99, Undefined Behavior was intended to, among other things, "identify areas of conforming language extension" [their words]. Code which relies upon UB may be non-portable, but the authors of the Standard expressly did not wish to demean such code; that is why they separated out the terms "conforming" and "strictly conforming".
I don't think it's a myth so much as a misunderstanding of terminology. If an implementation defines some undefined behavior from the standard, it stops being undefined behavior at that point (for that implementation) and is no longer something you need to avoid except for portability concerns.
You're exactly right that this is why there is a distinction between conforming and strictly conforming code.
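To put that terminology point in concrete terms: an implementation can document its own meaning for something the Standard leaves undefined. A minimal illustration, assuming GCC and its documented -fwrapv option (which makes signed arithmetic overflow wrap in two's complement):

    /* Compile with: gcc -fwrapv wrap.c
     * The Standard leaves signed overflow undefined, but with -fwrapv GCC
     * documents wrap-around behavior, so for that implementation (and that
     * flag) this is conforming-but-not-strictly-conforming code, not UB. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        int x = INT_MAX;
        int y = x + 1;          /* wraps to INT_MIN under -fwrapv */
        printf("%d\n", y);
        return 0;
    }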
The problem is that under the modern interpretation, even if some parts of the Standard and a platform's documentation would define the behavior of some action, the fact that some other part of the Standard regards an overlapping category of constructs as invoking UB overrides everything else.
I could imagine misguided readings of some coding standard advice that would lead to that interpretation, but it's still not an interpretation that makes sense to me.
Implementations define undefined behavior all the time, and users rely on it. For instance, POSIX defines that you can convert an object pointer into a function pointer (so that dlsym can work at all), and implementations often compute offsets from a null pointer in their own 'offsetof' macro.
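A rough sketch of both cases (the library name "libfoo.so", the symbol "frob", and the helper names are placeholders):

    #include <dlfcn.h>
    #include <stddef.h>

    typedef void (*frob_fn)(void);

    frob_fn load_frob(void)
    {
        void *handle = dlopen("libfoo.so", RTLD_NOW);
        if (handle == NULL)
            return NULL;
        /* Casting the void * returned by dlsym() to a function pointer is
         * not defined by the C Standard, but POSIX defines it so that
         * dlsym() is usable at all. */
        return (frob_fn)dlsym(handle, "frob");
    }

    /* A traditional offsetof-style macro: "dereferencing" a null pointer is
     * UB per the Standard, but the implementation shipping such a macro
     * defines the behavior for itself. */
    #define MY_OFFSETOF(type, member) ((size_t)&((type *)0)->member)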
Such an interpretation would be the only way to justify the way the maintainers of clang and gcc actually behave in response to complaints about their compilers' "optimizations".