Hacker News

> Assuming there is no UB and optimizing under that assumption gives a speedup.

Sure, but to make sure you have no UB, at least for things like signed arithmetic, you basically have two options: give up signed arithmetic entirely, which is what I've done by using only unsigned, or prove your arithmetic always stays within range, which requires formal methods. Otherwise you have UB, and that 1-2% speedup exists because of that UB. If you do what I do, you get no speedup. If you use formal methods to ensure ranges, you do, but at an enormous cost.

> But in no way does making the assumption that your code has no undefined behavior rely on your code having undefined behavior.

But it does. Those optimizations only apply where UB could be present, because they exploit the assumption that it isn't. And if UB could be present, it most likely is.

Once again, those optimizations will never kick in on my code that uses purely unsigned arithmetic. So yes, those optimizations do require UB.
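A minimal C sketch of the point (function names are mine): unsigned arithmetic is defined to wrap, so the compiler must honor the wraparound case, while signed arithmetic lets it assume overflow never happens and fold comparisons away.

```c
#include <limits.h>

/* Unsigned overflow is defined to wrap modulo 2^N, so the compiler
 * cannot assume "x + 1 > x" here and the UB-based optimization never
 * fires on code like this. */
unsigned wraps(unsigned x) {
    return x + 1;  /* wraps(UINT_MAX) is 0, by definition */
}

/* Signed overflow is UB, so a compiler is allowed to fold this whole
 * comparison to "always 1" -- the speedup being discussed comes from
 * exactly this kind of assumption. */
int folded_to_true(int x) {
    return x + 1 > x;
}
```

Compilers like GCC and Clang do perform this folding at `-O2`; the function compiles down to `return 1`.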




Is it formal methods to conclude that the vast majority of arithmetic, which for the software I typically write consists of iterators and buffer offsets into fixed-size buffers, can never overflow an int? Is it formal methods when I conclude that the 12-bit output from an ADC cannot overflow a 32-bit int when squared? Is it formal methods to conclude that a system with 256 kilobytes of RAM will never hold more than 2^31 objects? I guess I do formal methods then, at an enormous cost. Sometimes I add runtime checks with early bail-outs that let code further down the line be optimized.
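The ADC reasoning above can be sketched in a few lines of C (the function and sentinel are hypothetical, just to illustrate the range argument): a 12-bit reading is at most 4095, its square is at most 16769025, comfortably below INT32_MAX, so no overflow is possible.

```c
#include <stdint.h>

#define ADC_MAX 4095  /* 2^12 - 1: largest value a 12-bit ADC can emit */

/* Squares a raw ADC sample. The runtime check with an early bail-out
 * establishes raw in [0, 4095]; after that, raw * raw is at most
 * 4095 * 4095 = 16769025, far below INT32_MAX (2147483647), so the
 * multiplication cannot overflow int32_t. */
int32_t square_adc(int32_t raw) {
    if (raw < 0 || raw > ADC_MAX)
        return -1;  /* sentinel for out-of-range input (illustrative) */
    return raw * raw;
}
```

The check also tells the compiler the range of `raw`, which can enable further optimization of downstream code, which is the "early bail-out" pattern mentioned above.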


There are always exceptions, of course.

But then again, there's also the Ariane 5.



