
I have been full time in C++ development for >15 years. I can count on one hand the number of times I had memory correctness issues that weren't discovered and fixed trivially, all of them very early in my career. Like your anecdote, of course, mine doesn't prove anything.



I don't want to sound like I'm downplaying your career, but what kinds of projects did you work on for 15 years to have an experience this positive?

For instance, I find it incredible that you never had to debug and work around a third-party vendor DLL that you didn't have the source for but that was leaking memory like crazy. This is just one example of something that can't be "fixed trivially", since you don't have the source to modify, and it is extremely common in some fields (e.g. embedded).
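
The usual "fix" I've seen ends up being quarantine rather than repair; a rough sketch of the pattern, assuming a POSIX target and a hypothetical vendor_process_frame() entry point:

    #include <sys/wait.h>
    #include <unistd.h>

    // Hypothetical vendor entry point we only have a binary for.
    extern "C" int vendor_process_frame(const char* path);

    // Run each call in a throwaway child process so the OS reclaims
    // whatever the vendor library leaks when the child exits.
    int call_vendor_isolated(const char* path) {
        pid_t pid = fork();
        if (pid < 0) return -1;  // fork failed
        if (pid == 0) _exit(vendor_process_frame(path));
        int status = 0;
        waitpid(pid, &status, 0);
        return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
    }

You pay a process per call just to contain someone else's leak, which is hardly "trivial".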


I have been working on a monorepo C++ codebase with on the order of 1000 person-years of development on it.

And no, we didn't have major issues with memory leaks either. About on par with what I have seen in garbage-collected languages. RAII works quite well most of the time.
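
For the common cases it really is just a destructor doing its job; a minimal sketch of the idiom, with a made-up open_config() wrapping a C-style handle:

    #include <cstdio>
    #include <memory>

    // RAII for a C-style resource: the deleter runs on every exit path,
    // including early returns and exceptions, so nothing leaks.
    struct FileCloser {
        void operator()(std::FILE* f) const { std::fclose(f); }
    };
    using FilePtr = std::unique_ptr<std::FILE, FileCloser>;

    FilePtr open_config(const char* path) {
        return FilePtr(std::fopen(path, "r"));
    }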


I take it that your project being a monorepo means you have access to the code and can change it when deemed necessary?

I get how that would be characterized as "trivially fixable", but I assure you that these are not the kinds of projects people complain about when they discuss memory safety issues.

Consider that some people need to send emails to vendor companies begging them to stop segfaulting, corrupting the stack, or leaking memory. You're lucky if it gets fixed in a few months, because that means you won't have to seek alternatives, which would be even more time-consuming. In conclusion, there are many people out there dealing with memory safety issues that are anything but "trivially fixable".


Do you see now what I mean by calling this a trade-off? Not everyone is in the same boat. I don't understand how this can even be a controversial stance.


Microsoft and Firefox have both cited around 70% of their security bugs as memory-safety issues, and they presumably have tooling and whatnot. There are a few languages with good C/C++ FFI that are a better choice for memory safety, so the cost of switching there isn't very high. I grant there may be ABI edge cases or whatever, but C/C++ is no longer viable or necessary for a good portion of software.


Well, sloppy third-party libraries may be fixed by a psychological method: if you had an automated test environment that fuzzed the inputs and checked for correctness, you could label each library as "robust" or "weak" and leave it at that.
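
The harness for that label could be tiny; a sketch using the standard libFuzzer entry point under ASan, with a hypothetical vendor_parse() standing in for whatever the library exposes:

    #include <cstddef>
    #include <cstdint>

    // Hypothetical function of the library being graded.
    extern "C" int vendor_parse(const uint8_t* data, size_t size);

    // Standard libFuzzer harness; build with
    //   clang++ -g -fsanitize=fuzzer,address harness.cpp libvendor.a
    // Crashes or ASan reports => "weak"; a clean long run => "robust".
    extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
        vendor_parse(data, size);
        return 0;
    }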

Then people could decide which ones to use based on the label alone. This would be an incentive for people to fix their libraries.

Then the process of normal attrition would take care of all the sloppy libraries.

Evolution at its finest.


I've been working in a similar environment, and once a month (or more) someone comes to me with a crash they don't understand. It usually takes a couple of days to debug (since the easy bugs don't make it to me), and every single one is some kind of undefined behavior (temporary lifetime confusion, dangling reference, callbacks modifying containers during caller iteration, ...)
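
The container one in particular looks completely innocent at the call site; a distilled sketch of the kind of code I mean (names made up):

    #include <functional>
    #include <vector>

    std::vector<std::function<void()>> callbacks;

    void fire_all() {
        for (auto& cb : callbacks)   // iterators taken here...
            cb();                    // ...dangle if a callback push_backs
    }

    int main() {
        // Re-entrant registration: push_back may reallocate the vector
        // while fire_all() is still iterating it, a use-after-free.
        callbacks.push_back([] { callbacks.push_back([] {}); });
        fire_all();
    }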


The thing is, you can't know that unless you go out of your way to exploit your own code.
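
To be fair, "going out of your way" can be as cheap as one compiler flag. A tiny made-up example of a bug that appears to work until ASan flags it:

    #include <vector>

    // Build with: g++ -g -fsanitize=address,undefined dangling.cpp
    int main() {
        std::vector<int> v{1, 2, 3};
        int& first = v[0];
        v.push_back(4);    // likely reallocates; 'first' now dangles
        return first;      // ASan reports a heap-use-after-free here
    }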



