My point is that it is humanly impossible to write a bug-free program. In C and C++, bugs usually manifest themselves in UB.
To make matters worse, compilers, ever searching for diminishing returns on performance improvements, have been steadily making the CONSEQUENCES of UB worse, to the point that even debugging is getting harder and harder. These languages are unique in their growing user hostility.
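As a minimal sketch of the kind of surprise this causes (exact behaviour depends on compiler and optimisation level, so treat it as an illustration, not a guarantee): a guard that "obviously" detects overflow can be folded away because the optimiser is allowed to assume signed overflow never happens.

    #include <climits>
    #include <cstdio>

    // The optimiser may assume signed overflow never happens, so at -O2 it is
    // allowed to fold the condition below to "always false" and delete the
    // guard. Behaviour varies by compiler and flags; this only illustrates
    // the failure mode.
    bool will_wrap(int x) {
        return x + 1 < x;   // intended overflow guard, but UB when x == INT_MAX
    }

    int main() {
        std::printf("%d\n", will_wrap(INT_MAX)); // often prints 0 when optimised
    }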
Lint was created in 1979; by the mid-90s we already had Insure++, Purify and others, ages before the free-beer alternatives.
Yet being free still doesn't get the large majority of C and C++ developers to actually use them, as shown at one of the CppCon talks, where a mere 1% of attendees confirmed using any kind of analyser.
One of the most eye-opening papers in this regard was the integer overflow checking paper (which Regehr co-authored), which found that every library tested invoked undefined signed integer overflow. That includes SQLite, famous for its comprehensive testing, and several libraries whose sole purpose was to check whether an operation would overflow without invoking undefined behavior.
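For contrast, here is a minimal sketch (my own illustration, not code from the paper) of how a signed-addition overflow check has to be written so that it never evaluates the overflowing sum itself:

    #include <limits>

    // Returns true if a + b would overflow int, without ever evaluating a + b
    // on an overflowing input -- the kind of pre-check those libraries were
    // supposed to provide without themselves invoking UB.
    bool add_would_overflow(int a, int b) {
        if (b > 0 && a > std::numeric_limits<int>::max() - b) return true;
        if (b < 0 && a < std::numeric_limits<int>::min() - b) return true;
        return false;
    }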
Believing you are skilled enough to write C/C++ code that doesn't exercise undefined behavior shows either that you don't know what undefined behavior is, or that you believe you are the best programmer to have ever lived.
There hasn't been much change. Languages immune to classes of vulnerabilities by default, and/or sound checkers that can catch all of them, seem necessary. And by "seem necessary" I mean there is massive empirical evidence that most developers can't code safely without such alternatives, even on critical, widely-used, well-funded projects.
Well, don't use it. You aren't the target customer. For me, fixing a performance bottleneck is a lot harder than finding a divide-by-zero, and I certainly don't want to pay for the compiler to check things like divide-by-zero without my asking it to. When I don't care about performance, I reach for a scripting language or something. It's a tool; don't get all emotionally worked up about it.
We are entitled to be emotional about it, because we all have to use tools which have the misfortune of being written in C-derived languages.
Even if I don't touch those languages for production code, my whole stack's safety depends on how well those underlying layers behave, and on how responsible their developers were about writing secure code.
Which, as the buffer overflows in IoT devices keep proving, is not much.
Sure it can, but that wasn't the question; rather, it was what happened to the code I have written.
NDAs prevent us from talking about such things.
What magic variant of a C or C++ compiler are you using that throws errors on buffer corruption? Unless you mean shipping code with debugging mode enabled in production instead of a proper release build.
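To be concrete (a sketch, assuming g++ or clang++): nothing in a plain release build rejects or reports the out-of-bounds access below. Only opt-in runtime instrumentation like -fsanitize=address catches it, and that is not something you normally ship.

    // overflow.cpp: no mainstream compiler rejects this at compile time.
    //   g++ -O2 overflow.cpp                     -> builds and runs silently
    //   g++ -fsanitize=address -g overflow.cpp   -> aborts at runtime with a
    //                                               stack-buffer-overflow report
    #include <cstdio>

    int main(int argc, char**) {
        int buf[4] = {0, 1, 2, 3};
        std::printf("%d\n", buf[argc + 3]); // out of bounds whenever argc >= 1
    }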
> buffer overflows are caught by essentially every modern compiler.
Only if you use high-level arrays with bounds checking, which are not always fast enough and can become a maintenance burden. And if they are to be absolutely secure (std::vector isn't: its buffer can be moved behind your back), they also require GC.
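A quick sketch of both points (plain C++, nothing exotic): at() buys you a checked access at a runtime cost, and the underlying buffer can indeed be moved out from under any pointer you kept.

    #include <vector>
    #include <cstdio>

    int main() {
        std::vector<int> v{1, 2, 3};

        // v[10] is undefined behaviour; v.at(10) throws std::out_of_range,
        // at the price of a bounds check on every access.
        // int bad = v[10];      // silent UB in a release build
        // int ok  = v.at(10);   // throws instead of corrupting memory

        // "Moved behind your back": push_back may reallocate, after which any
        // pointer or reference previously taken into the buffer dangles.
        int* p = &v[0];
        for (int i = 0; i < 1000; ++i) v.push_back(i); // almost certainly reallocates
        (void)p;                  // dereferencing p here would be UB
        std::printf("%zu\n", v.size());
    }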