No, it is not, and it has never been OK for as long as C has been standardized by ANSI or ISO.
C89 says that "The value of a pointer that refers to freed space is indeterminate." and that behavior is undefined "upon use ... of indeterminately-valued objects", hence a compiled program can e.g. behave as if `new_pointer == old_pointer` even though the object was relocated in memory.
So if we print out the pointer before the reallocation, the two compare unequal, but if we don't, they compare equal.
Funnily enough, `-fsanitize=undefined` doesn't seem to detect this. Neither does `-fsanitize=address` (though with ASan the results are at least consistent: the pointers compare unequal in both cases).
The standard says `free` takes a `void *`, i.e. it takes the pointer by value, and passing an argument by value prevents the callee from changing the caller's copy.
More literally, your pointer value is copied into a register before the call into libc, so `free` can't change the value even if it wanted to. `realloc` can't either, for the same reason.
That the provenance model has arrived at the premise that `a != a` can hold for a 64-bit value is an error in the formalism; specifically, the language is deliberately inventing things that cannot be so on hardware.
Note that (a) pointer representations can contain bits representing the provenance that don’t participate in equality comparison, and (b) a “C implementation” in terms of the C standard includes the compiler, which means the compiler is allowed to track pointer provenance statically even when the representation in memory doesn’t differ (e.g. exactly what GCC is doing here). In addition, a C implementation is also allowed to track pointer provenance externally at runtime, again even if the pointer representation (as exposed by memcpy into a char buffer) is the same.
To the extent that your compiler has embraced the provenance model, yes. But that does mean that you can't assume that the language pointer is the one the machine gave you and therefore probably can't write code that does anything with pointers other than pass them around as opaque handles. No testing their alignment, no comparing them to each other.
> specifically the language is deliberately inventing things that cannot be so on hardware
So what? The standard has "undefined behavior"; real implementations always do something, even if that something cannot be determined in advance. The standard is the standard, it's not a machine.
Well, at some point in history C was about telling a physical machine what to do. You can see that in the operations it exposes.
I don't know what modern C is for: the one that manipulates an abstract machine with concurrency oracles and time-travelling metadata on object identifiers. It looks like an aberration derived from C++ to me.
This is legal. But dereferencing `old_pointer`, even after this check has passed, is undefined.