I've always felt like "memory leak" wasn't the proper term for GC'ed languages.

If you forget to free a malloc'ed buffer, yeah, you're leaking memory.

Here, in the example given, the developer would be doing something stupid (i.e., keeping track of something created on each request in a global variable). That isn't a leak per se.

I mean, it's not a memory management bug you can fix: it's a behavior that's inherently not sustainable and that will need to be changed.
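For concreteness, a minimal JavaScript sketch of the pattern being described (the handler and names are made up, not from the article):

    // Module-level ("global") state that grows with every request.
    const seenRequests = [];

    function handleRequest(req, res) {
        // Each request appends an entry and nothing ever removes one,
        // so the array, and everything it references, grows without bound.
        seenRequests.push({ url: req.url, at: Date.now() });
        res.end('ok');
    }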




I think perhaps you're being overly pedantic: missing a free or leaving a reference to no-longer-used memory has the same result for the application. The heap grows and eventually causes problems.

I run distributed embedded Linux systems, and we use both C and Lua, which is GC'ed. We track heap use from both Lua and C daemons for pretty much exactly the same reason: both C and Lua apps can leak, and I need to know if we've got a problem in the field.

Our biggest problem in C is Jansson and keeping track of all the borrowed vs. stolen references. Lua is so dynamic that it's easy to lose track of someone holding a reference for too long, or forever. When you mix in closures, it's not always clear what the lifetime will be, especially if the result or error never happens.
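The Lua closure problem has a direct JavaScript analogue, which may make the lifetime issue easier to see (all names here are hypothetical):

    // A pending-callback table: each closure captures whatever it uses,
    // so if the result or error never arrives, the entry, the closure,
    // and the captured buffer all stay alive forever.
    const pending = new Map();

    function request(id, bigBuffer, cb) {
        pending.set(id, (err, result) => {
            cb(err, result, bigBuffer.length);  // keeps bigBuffer alive
        });
    }

    function onResponse(id, err, result) {
        const cb = pending.get(id);
        pending.delete(id);  // without this (or a timeout), entries leak
        if (cb) cb(err, result);
    }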


I understand your position, but often the memory leaks in your runtime are not your fault — https://www.joyent.com/blog/walmart-node-js-memory-leak

It's difficult to find and diagnose these types of issues, so anyone looking into RAII-style leaks or long-lived closures needs tools to help them understand their allocations and their life cycles in order to solve their problems.


GC theory starts with "dead objects should be collected", and then immediately retreats to "unreachable objects should be collected" since unreachable objects are dead.

But some objects are reachable, and also dead. My progress spinner advances on a timer, but nobody will ask me to draw. I'll invalidate my bitmap cache on screen resolution changes, but nobody cares about my bitmap.

Most GCs would rather leak than introduce memory unsafety. But their design biases towards leaks: weak references become this weird expensive path and that's why you don't use them routinely.
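To make the trade-off concrete, here's roughly what the weak-reference path looks like in JavaScript (WeakRef, ES2021); note the extra deref-and-check on every access, which is part of why it isn't the routine path:

    // A bitmap cache that holds its values weakly: a reachable-but-dead
    // bitmap can still be collected, at the cost of re-rendering later.
    const bitmapCache = new Map();  // name -> WeakRef to a bitmap

    function getBitmap(name, render) {
        const ref = bitmapCache.get(name);
        const cached = ref && ref.deref();  // undefined if GC'd
        if (cached) return cached;
        const bitmap = render(name);
        bitmapCache.set(name, new WeakRef(bitmap));
        return bitmap;
    }

(A FinalizationRegistry can clean up the stale map entries themselves; that's omitted here for brevity.)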


Collecting exactly the dead objects in all situations is not possible even in theory:

    function f() {
        let x = Object(1);
        if (halts(some_program)) {
            return Object(2);
        } else {
            return x;
        }
    }
Whether the contents of x are dead after the assignment depends on whether some_program halts. But a GC that could determine that could solve the halting problem.


That code doesn't make sense as written. Either halts() is itself a function that solves the halting problem, or halts() works by running some_program, in which case the else branch is dead: for it to be selected, halts() would have to return false, but a halts() that runs some_program never returns at all when the program doesn't halt.

Dynamic code is where the GC can't know whether something merely has a stale reference or whether the code path that would use it simply doesn't exist yet. It'd be interesting to have to tag such data appropriately, as that opens up a lot of leak protection.


Yep, you're right. I should've just said that the condition of the if statement can in general be undecidable.


I think OP chose that example just because he/she wanted a very simple scenario to explain.

The takeaway from this blog post is to get heap dumps from the prod server and analyse them by comparison in Chrome dev tools.
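In Node that amounts to something like the following (v8.writeHeapSnapshot has been built in since Node 11.13; triggering it from a signal handler is just one option):

    // Write a heap snapshot on demand, e.g. on SIGUSR2.
    const v8 = require('v8');

    process.on('SIGUSR2', () => {
        // Produces a .heapsnapshot file that can be loaded
        // into the Chrome DevTools Memory tab.
        console.log('wrote', v8.writeHeapSnapshot());
    });

Take one snapshot, let the suspected leak run for a while, take another, and use the Comparison view in the Memory tab to see what accumulated between the two.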


Memory leaks can certainly be more sinister than the example used here.

It's not possible to never use global state; it makes sense in some cases. And then you end up with harmless-looking things, like adding memoization that leaks memory when not done properly.
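For example, a memoizer over an unbounded key space is exactly that kind of harmless-looking global state (expensive() is a stand-in for any costly function):

    // Classic memoization leak: the cache only ever grows.
    const cache = new Map();

    function memoized(key) {
        if (!cache.has(key)) {
            cache.set(key, expensive(key));  // never evicted
        }
        return cache.get(key);
    }

With object keys, a WeakMap would let entries die with their keys; with string keys you need an explicit size bound or TTL.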



