Trying to finely parse the semantics of null and undefined like this is a fool’s errand, insofar as there isn’t much in the language which enforces your conventions. Also, these conventions fall apart when we check how null is actually used. For instance, all DOM nodes have a default of null for their first children and next siblings. Does this mean “the value exists but is not set?” What does it mean when the browser is setting the value and not the user?
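Concretely, a quick browser-console sketch:

    var p = document.createElement('p');
    p.firstChild;  // null, set by the browser, not by any user code
    p.nextSibling; // null, does this mean "exists but is not set"?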

At the end of the day it would have been much nicer for there to only have been one nothing type, but what’s done is done.


Element.foo being undefined says “foo? What’s foo? I don’t know what you’re talking about.”

Element.{firstChild, nextSibling} being null means “I know what firstChild (or nextSibling) is, but this element doesn’t have one.”

It would be categorically wrong for things like Element.{firstChild, nextSibling} to be undefined.
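A quick sketch of the distinction:

    var el = document.createElement('div');
    el.foo;        // undefined: no such property anywhere on the chain
    el.firstChild; // null: a known property, but this element has no children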


Some developers would interpret your rule as: if it doesn’t make sense for you to make this property access on this specific object, the value should resolve to undefined and not null. Problem is, there are already mechanics for the kind of check you describe (the in operator and hasOwnProperty both kinda do what you’re looking for). Moreover, if you were to design the DOM API from scratch, by your logic some developers might assume that text nodes, which never have children, should have a firstChild of undefined and not null, because it should say “I don’t know what you’re talking about.” You can debate this last point, but know that some people are going to interpret your null/undefined rules this way.
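For instance, roughly:

    var text = document.createTextNode('hi');
    'firstChild' in text;              // true: Node defines it...
    text.firstChild;                   // ...but it's null, since text nodes never have children
    text.hasOwnProperty('firstChild'); // false: it lives on the prototype, hence "kinda"
    'foo' in text;                     // false: genuinely unknown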

I’ve seen null/undefined semantics play out; they more or less collapse under the slightest of deadline pressures or scaling of team sizes. There are infinite ways to slice the idea of “emptiness” or “lack of data” and any definition you provide (the definitions both you and OP provide give a lot of wiggle room) would be interpreted fifteen ways by fifteen developers, so much so that even changing hands just once will cause any null/undefined distinctions to sublimate in any codebase.

There are so many languages which do just fine with one null type; I don’t understand why JavaScript developers have to do these sorts of mental gymnastics to justify the existence of two in JavaScript, especially given the more or less insane fact that `typeof null === "object"`. If you consistently use undefined or null exclusively internally, and accept both via the == null trick externally, you can pretty much avoid having to deal with null/undefined semantics entirely.
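For reference, the trick in question matches exactly null and undefined, and nothing else:

    undefined == null; // true
    null == null;      // true
    0 == null;         // false
    '' == null;        // false
    false == null;     // false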


The early DOM APIs certainly have some warts due to inheritance design mismatch: they designed things in a single-inheritance way despite the DOM actually being multiple-inheritancey. These days, new things like the append() method get added to the ParentNode interface, so that they’re provided by Element and DocumentFragment, but not by Text or Comment; but back when they designed these things at the start, they only had Node, and decided to implement these methods on Node rather than individually on Element and DocumentFragment. While I imagine there were practical reasons for doing it that way then, with the benefit of hindsight I state categorically that it was a design error. If the spec- and browser-makers were willing to break backwards compatibility, firstChild would certainly be shifted from Node to ParentNode, and text nodes would no longer have a firstChild member.
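You can see the split in any modern browser (sketch):

    typeof document.createElement('div').append; // "function": Element implements ParentNode
    typeof document.createTextNode('x').append;  // "undefined": Text does not
    document.createTextNode('x').firstChild;     // null: the legacy Node-level wart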

> There are so many languages which do just fine with one null type; […] two in JavaScript

I quibble over your claim here, because JavaScript doesn’t have two null types. undefined is not a null type. Rather, it’s JavaScript’s equivalent of what is in most dynamically-typed languages an exception, and a compilation error in most or all statically-typed languages. As a concrete example, where JavaScript produces undefined, Python raises AttributeError or KeyError. There are similarities to NaN as well—both come from a philosophy of trying to keep the code running even if you ask for something that isn’t defined. Yes, undefined then behaves a lot like null in some ways, just as NaN behaves like a number in some ways (though the two cases are not parallel, merely passingly similar). But then, false behaves like null in most of the same ways too (though not quite all—most notably, the `== null` comparison—but it’s well understood that a lot of JavaScript’s == comparisons are illogical and inconsistent, so this should not be considered significant to the design); yet no one claims false to be a third null value.
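To make the comparison concrete (Python behavior noted in comments):

    var obj = {};
    obj.missing;       // undefined: no exception, the code keeps running
                       // (Python would raise AttributeError or KeyError here)
    false == null;     // false: the one null-like check that false fails
    undefined == null; // true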


Why does Array.prototype.find() return undefined and not null if it didn't find a value? According to your logic, the sentinel to indicate "this array doesn't have an element that matches the predicate" should be null.


I don’t know. I would guess that return type involved a slightly heated discussion before undefined won over null, so that you can distinguish “nothing found” from finding an element that is itself null.
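The ambiguity is easy to demonstrate:

    [3, null].find(v => v === null); // null: found a null element
    [].find(v => true);              // undefined: nothing found
    [undefined].find(v => true);     // undefined: found one, but indistinguishable from "not found"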

(This is a fundamental problem of nullable types which is solved by algebraic data types; in Rust, the equivalent method would return None for “no value found”, or Some(None) if it found a None value.)


Also, the "wisdom of the Ancients" in JS was to never intentionally set a value to `undefined`. While a lot of the quirks have been paved over by TC39, `undefined` was originally considered an implementation detail of the JS engine and had different semantics in different engines, especially when setting something to `undefined`: `myobject.field = undefined` might be equivalent to `delete myobject.field` in one engine, equivalent to setting it to `null` in another, its own explicit primitive value in a third, while a fourth threw an error when setting anything explicitly to undefined, because it was not a value at all. Even with paved-over semantics and a general convergence among JS engines (and a near monopoly of V8 in practical usage), I still find it worrisome to see codebases that treat undefined and null as very distinct values instead of shades of gray of the same concept, because `undefined` certainly wasn't meant to be a value originally.
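For what it's worth, the semantics engines have since converged on do keep the two operations distinct:

    var obj = { field: 1 };
    obj.field = undefined;
    'field' in obj; // true: the key survives, holding the value undefined
    delete obj.field;
    'field' in obj; // false: the key itself is gone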


There has been no version of JavaScript where it was practical to avoid setting a variable to undefined. As far as I know, things like these would always do it.

    var x = {}.foo;            // x is undefined: foo doesn't exist on the object
    var y = (function() {})(); // y is undefined: the function returns nothing
These are contrived examples, but they represent things done by very reasonable code. There are all kinds of ways that `undefined` can get assigned into a variable. It's inevitable that those cases would have to be handled.
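A few more, off the top of my head:

    function f(a) { return a; }
    f();        // undefined: the missing argument
    [1, 2][10]; // undefined: an out-of-bounds index
    void 0;     // undefined: the old-school way to spell it on purpose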


I'm saying it's impractical to consider `null` and `undefined` as different "values" in JS. `undefined` was originally built to be a "thrown" exception for a language without thrown exceptions. It wasn't intended to be a placeholder value like null is. The language today makes it possible: you can write `var x = undefined` and expect things not to blow up or behave all that differently/quirkily between browsers and browser modes. I'm still going to be worried/skeptical of any code that intentionally sets objects and properties explicitly to `undefined` and treats that as a different value from `null`.

Yes, you have to handle cases where things are undefined, but I'd be wary of handling them too dissimilarly to where things are null, because `undefined` was not designed to be "Null 2: Electric Boogaloo", it was designed to be "404 Item Not Found Exception" in the time before JS had real exceptions.


A better example is querying a database or API where you can specify the returned properties: null === queried but no value, vs undefined === not queried.
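Something like this hypothetical result object:

    // Suppose only name and email were selected from the row:
    var row = { name: 'Ada', email: null };
    row.email; // null: queried, and the column is empty
    row.phone; // undefined: never queried at all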
