
What do you mean by "work"? If you assign a default parameter value, you never need to mention undefined or null.



'work' as in if you explicitly pass in undefined as a parameter to a function, the default parameter will be used in its stead.
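A minimal sketch of that behaviour (the function name is made up):

    function greet(name = 'stranger') {
      return 'hello ' + name;
    }

    greet(undefined); // 'hello stranger' – an explicit undefined still triggers the default
    greet(null);      // 'hello null'     – null is passed through as a value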

I've never, ever run into this, because I never use undefined for anything other than stuff that's explicitly not set, and I have no fucking idea why other people do. But I can definitely see it causing bugs if you do stuff like use '!=' everywhere and treat null and undefined as the same thing.

This is why the original comment is wrong: you should be narrowing your types where possible. It costs nothing to just always use null when you're explicitly setting something, and then you can use '!==' wherever you want, which means that if someone else sets something to undefined you'll catch the bug straight away, when it's easiest to catch.

If you treat them the same you're just asking for some 3rd party lib to do a '!== null' or accidentally overwrite a "null" with a default parameter you're not expecting, and that's way, way harder to troubleshoot.
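A rough sketch of that convention (the variable name is invented for illustration):

    // Convention: "explicitly unset" is always null, never undefined.
    let selectedMonth = null;

    // Somewhere else, a bug sets it to undefined instead:
    selectedMonth = undefined;

    // Strict checks expose the mix-up immediately...
    if (selectedMonth !== null) {
      console.log('looks set, but the value is', selectedMonth); // undefined – easy to spot
    }

    // ...whereas a loose check quietly papers over it:
    if (selectedMonth != null) {
      // never runs, and the wrong sentinel keeps flowing through the codebase
    }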


I've never run into this, because I never explicitly pass an expression when I want the default parameter value to be used.

You could just as easily run into a bug with 3rd party lib code by distinguishing between the two when that library doesn't.

The only reason I've ever intentionally used `null` in my own code is that it precedes `undefined` in the default array sort. But if I ever found myself using library code that required `null` for something, I'd say "Huh. That's weird." And then I'd pass it a `null`.
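For reference, that default-sort ordering looks like this:

    // With no comparator, undefined elements are always moved to the end of the
    // array and everything else is compared as strings, so null ends up ahead
    // of undefined.
    [undefined, null, 'a'].sort(); // ['a', null, undefined]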


> I've never run into this, because I never explicitly pass an expression when I want the default parameter value to be used.

What? That's not what prevents you from running into bugs. The bugs come when you treat null and undefined as equivalent and try to pass them into a function with default params without knowing which one you have, because the defined behaviour of default params is different for null and undefined.

For example, say you have a data type of 'null or [0...11]' for a month input the user may or may not have interacted with yet. It gets passed around a bit and ends up going through a function defined with default params, like 'function doSomethingWithDate(day = 0, month = 0, year = 1900)', and in that function there's a null check like 'if day, month or year is null, do whatever we want to do if the user hasn't entered a date yet'.

A few months later someone picks up a ticket to refactor the date input, and they use 'undefined or [0...11]' for the month instead, which supposedly doesn't matter because your team always just checks for both with things like '==', '!=' or '??'.

Except, surprise, it does matter, because now something somewhere else breaks: it thinks the state is 'the user has entered 1/1/1900' instead of 'the user hasn't entered anything yet'.
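A rough sketch of that failure, reusing the signature from the example (everything here is hypothetical):

    function doSomethingWithDate(day = 0, month = 0, year = 1900) {
      if (day === null || month === null || year === null) {
        return 'user has not entered a date yet';
      }
      // downstream code treats this as a real date
      return { day, month, year };
    }

    // Before the refactor, "not interacted with yet" was null:
    doSomethingWithDate(null, null, null);
    // -> 'user has not entered a date yet'

    // After the refactor it's undefined, so every default kicks in
    // and the null check never fires:
    doSomethingWithDate(undefined, undefined, undefined);
    // -> { day: 0, month: 0, year: 1900 }, i.e. "the user entered 1/1/1900"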

Of course you might not run into this specific bug, because the code is a bit contrived, and most good programmers will recognise that whatever 'doSomethingWithDate' is, it probably doesn't need to run unless the user has actually entered a date already. That null check should happen as close to the user input as possible, and there shouldn't really be any room for people to use default params between those two places.

The problem is that in the real world people write dogshit code like this all the fucking time. That would rate at about the 5th percentile of all the garbage code I've seen in my career.

My point is that it's literally impossible to run afoul of the language's differences between null and undefined if you strictly use either null or undefined, as opposed to the blasé mindset of "just always check for either". You know what your data type is while you're writing the code, and you can look up the standard library or 3rd party library docs to see how they interpret it.

If your data type is 'null or undefined or ...' then you don't know which one you've got until runtime, and you need to account for both possibilities everywhere. That means coercing it into specifically null or specifically undefined every single time you want to pass it to a library function, to your own codebase's functions that use default params, or to any other place where the language itself treats undefined and null differently.
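Concretely, the wider type forces a coercion at every boundary. A rough sketch with made-up names:

    // Hypothetical stand-ins for a 3rd-party call and your own default-param function:
    const renderMonthPicker = (m) => (m === null ? 'empty picker' : `month ${m}`);
    const doSomethingWithDate = (day = 0, month = 0, year = 1900) => ({ day, month, year });

    let monthInput; // type is effectively "number or null or undefined"

    // The library treats null as "unset", so undefined has to be coerced:
    renderMonthPicker(monthInput === undefined ? null : monthInput);

    // Your own default params only fire on undefined, so null has to be coerced:
    doSomethingWithDate(1, monthInput ?? undefined, 2024);

    // Pick one sentinel up front and neither coercion is needed.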

Always use the minimal data type for representing whatever you need. If you don't, you're literally choosing to have ambiguity in your codebase. It makes no sense.

Also this:

> You could just as easily run into a bug with 3rd party lib code by distinguishing between the two when that library doesn't.

...is nonsense. If the library doesn't distinguish between the two, then by definition it cannot matter which one you pass in, since it's treating them the same.



