> Ideally the linker would notice the ODR violation, but doing so ends up being expensive – you have to look into every object file to see if any of functions are defined in different ways, and the definition of ‘different’ is not obvious.
For this overhaul you mention, we just need to eliminate "undefined behavior" from our C and C++ implementations. Just. Or permit that [time-]expensive operation to run. Or create some new languages that specify everything so that there is no "undefined behavior."
For the first option (removing undefined behavior), every attempt is met with code that relies on a particular compiler's implementation of "undefined" for a particular operation. For the second (run the expensive operation), the option needs to be available in the linker (I have no idea whether it is) and programmers need to have the patience to let the tools run. For the third, see all the new languages upon which much work is done, especially in the LLVM community (Julia, Rust, Swift).
I believe the renaissance you want is underway. Just not in C/C++ land.
The problem with generating ODR violation diagnostics is that it requires the linker to compare all the duplicate definitions to each other, not on a bit-by-bit basis but by somehow discerning whether they were generated from the same input source or are functionally equivalent. Without embedding some hairy metadata in the object files, this seems like something that reduces to the halting problem.
This is also one of the many problems that simply ceases to exist once you move away from the C/C++ model of "preprocessor as module system".
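To make concrete why the linker stays silent, here is a minimal sketch of an ODR violation (file and function names are invented for illustration). Two translation units define the same `inline` function with different bodies; the linker just keeps one copy and discards the other, so both call sites silently bind to whichever copy survived:

```shell
# Two TUs each define inline f(), but differently -- an ODR violation.
cat > a.cpp <<'EOF'
inline int f() { return 1; }
int call_a() { return f(); }
EOF
cat > b.cpp <<'EOF'
inline int f() { return 2; }   // same name, different body
int call_b() { return f(); }
EOF
cat > main.cpp <<'EOF'
#include <cstdio>
int call_a();
int call_b();
int main() { std::printf("%d %d\n", call_a(), call_b()); }
EOF
# No error, no warning: the linker keeps one definition of f()
# and throws the other away.
g++ a.cpp b.cpp main.cpp -o odr && ./odr
```

At `-O0` both calls typically print the same number, because both resolve to the single retained definition; with `-O2` each call may be inlined from its own translation unit and print different numbers. The behavior is undefined either way, and nothing in the toolchain complains.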
I believe that GCC, when running in LTO mode, embeds exactly this kind of 'hairy metadata' (IIRC the GIMPLE code for the function) and is in principle capable of identifying ODR violations. I have yet to try this capability myself though, so I don't know how effective it is.
edit: reading the docs again, it seems that it only detects ODR violations of the layout of types (and vtables).
I seem to remember that VC++ has a (possibly undocumented?) option to detect ODR violations. It might only work on LTCG builds where functions are stored in an intermediate representation, prior to optimization.
But I can't find any references to it, so maybe I am imagining it.
This detects differing structure layouts between compilation units, which is simple to do given some metadata in the object files. On the other hand, it also shows why detecting ODR violations in executable code is non-trivial: the same source code compiled with different compiler flags can lead to different output that might or might not be compatible (enforcing the same compiler configuration for all compilation units is certainly a non-starter; think JS runtime vs. FFmpeg, or 3D renderer vs. physics simulation...).