A complete guide to TypeScript’s 'never' type (zhenghao.io)
95 points by bpierre on March 11, 2022 | 63 comments



What programmer hasn't at some point in their career written code like this:

    throw new Error(`Unexpected value: ${value}. This should never happen!`);
...and then had a user submit a bug report containing that very message. The `never` type is a nice way to model those scenarios and hopefully avoid such an error ever being seen by the user.

    // Compile error unless we've ruled out
    // `value` at the type level
    assertNever(value, new Error(`Unexpected value: ${value}.`));
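
Something like this could work as the helper (a rough sketch; the exact signature is my assumption, only the name comes from the snippet above):

    // Accepts only `never`, so calling it forces the compiler to prove
    // every other possibility has been handled first.
    function assertNever(value: never, error?: Error): never {
      throw error ?? new Error(`Unexpected value: ${JSON.stringify(value)}`);
    }

    type Shape =
      | { kind: 'circle'; radius: number }
      | { kind: 'square'; side: number };

    function area(shape: Shape): number {
      switch (shape.kind) {
        case 'circle': return Math.PI * shape.radius ** 2;
        case 'square': return shape.side ** 2;
        default:
          // Only compiles because `shape` is narrowed to `never` here; add a
          // new Shape variant without handling it and this call becomes an error.
          return assertNever(shape, new Error(`Unexpected shape: ${JSON.stringify(shape)}`));
      }
    }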


The first one is a very useful error as a user though. It tells me the unexpected value isn't my fault and I should go report a bug.


The interesting thing here is you can still get the error and display the message! Just that you _might_ be able to catch issues beforehand as well, and avoid even getting to that kind of code.

TypeScript's practicality here does let you walk and chew bubblegum at the same time. This sort of stuff is super practical for UI development, because ultimately at the interfaces with servers you're likely to make some assumptions about incoming data, and so you'll have that fundamental cast in your code from `any` to `Foo`, and this sort of error will be reportable.
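
For instance (a made-up sketch; `Foo`, its fields, and `fetchFoo` are all hypothetical):

    // Hypothetical boundary: the one place we assert what the server sends.
    type Status = 'active' | 'archived';
    interface Foo { id: string; status: Status }

    async function fetchFoo(url: string): Promise<Foo> {
      const res = await fetch(url);
      return (await res.json()) as Foo; // the fundamental cast from `any` to `Foo`
    }

    function label(foo: Foo): string {
      switch (foo.status) {
        case 'active': return 'Active';
        case 'archived': return 'Archived';
        default:
          // The compiler sees `foo.status` as `never` here, but if the server
          // sends a status we didn't model, this is the reportable runtime error.
          throw new Error(`Unexpected status: ${foo.status}`);
      }
    }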


Agreed that it's nice when an error message assures a user they did nothing wrong. But it's still better if the user never encounters the bug in the first place because it was ruled out by the type system.


It's not a useful message to a layperson. The message is a contradiction, which makes no sense to a non-technical person (who is also unlikely to report it properly, in my experience). They will say that the computer is broken, or that the program stopped.

Ideally, the user is presented with a general message telling them how to report a bug (a link or something?), together with a base64-encoded blob containing the context of the error (e.g., a stack trace, and maybe also the very message in the exception), and is asked to submit that blob as well.


Wow, I would hate that. Fortunately I've never seen it before. Seems like "the user is too dumb to lay their lowly eyes upon our error message".


That is correct, the average user does not know and does not care to know about the inner machinations of the software. They aren't "dumb," they just don't care. And so the onus should not be on them to file your bugs.


https://en.wikipedia.org/wiki/Bottom_type#In_programming_lan...

if you want to know what it’s called in other languages (for some reason it doesn’t mention `Void` for Haskell though)


> Since there’s no values in the set, never type can’t never (pun-intended) have any value

That’s not a pun; that’s just poor grammar. “never type can never (pun-intended) have any value” makes sense.


Irregardless of your subjective opinion about grammar, you literally expressed the exact same thing. It’s either a pun in both or it’s a pun in neither.


I have seen such double negatives in many places, used to represent a strong negative. Don't know if it is some trend!


It’s fairly common usage in the US southeast, and in AAVE. Regarding it as poor grammar rather than an evolution of dialect is a matter of (snobbish) taste more than anything.


One major problem with it is that even in those dialects, a double negative can also be a positive, and outside of some semi-fixed constructs that mandate one or the other interpretation in that dialect, the distinction relies on things like tone and/or stress/emphasis, which of course are lost in writing. This makes it rather unwise to use in writing, unless context clearly dictates one interpretation.


The more northern dialects are infected with certain negation-related illogicalities too. Consider this New York Times headline:

“All Options Are Not on the Table as Biden Moves Troops Closer to Ukraine” https://www.nytimes.com/2022/02/05/us/politics/biden-ukraine...

which should have been “Not All Options Are on the Table”, since the intended interpretation is

¬∀ option, on(option, table)

rather than the very different

∀ option, ¬on(option, table).


I am never not confused by this.


Funnily enough, in Russian the double negative is completely normal and that is a normal way to express many negatives. It always triggers my brain when I have to say a double negative in Russian, "just do it, it's completely normal, don't worry about it..." (inner feeling-based monologue).

https://www.russiantutoring.com/post/double-negation-in-russ...


Reminds me of a classic linguistics joke (I assume from a Tom Scott video given where I encounter linguistics the most):

A professor is lecturing on linguistics: "In English a double negative has a positive meaning. However, in some languages—such as Russian—a double negative still has a negative meaning. There isn't, however, a language in which a double positive holds a negative meaning."

A student in the back responds: "Yeah, right."


This is widely attributed to Sidney Morgenbesser (https://en.wikipedia.org/wiki/Sidney_Morgenbesser).


This reminds me of ruminations with a friend about the nuance of combined positive/negative affirmations. Eg “no yeah” used as an emphatic agreement in the positive, and “yeah no” an emphatic agreement in the negative.


The high-brow name for this syntactic gadget is “negative concord”[1], and Wikipedia lists plenty of languages that have it[2] (including several European ones and even Old English!).

[1] http://glottopedia.org/index.php/Negative_concord

[2] https://en.wikipedia.org/wiki/Double_negative


> That’s not a pun that’s just poor grammar.

There is no such thing as "poor grammar" --- it's a myth. Most of the grammatical "rules" we tend to be taught in primary and secondary school English courses were invented relatively recently, and they were created specifically for the purpose of establishing an arbitrary class distinction.

Now, there's something to be said about markedness [1]. If the majority of a specific group of speakers of a language find a particular utterance (phrase, sentence, pronunciation, etc) to stand out in a way that feels unnatural, then that utterance is called "marked". Clearly, "can't never" is marked in your perspective. (And, indeed, I also find it to be marked in my own dialect.)

However...

There do exist large groups of native English speakers for whom this utterance is not marked. In these groups, this is a regular way to emphasize a negative. So, in those groups, the utterance is unmarked.

So it is wrong to say that your notion of markedness is universal. Undoubtedly, there are utterances you would produce that members of other groups of native English speakers would find marked. (A simple example offhand: American English speakers find the British constructions of "go to hospital" or "to be on holiday" to be marked. Does that make these utterances ungrammatical universally? Absolutely not.)

Context is important. If you are writing, say, an academic paper that you want to publish at a particular venue, then you are expected to adhere to the grammatical conventions of that venue. But for general everyday speech? The rule is straightforward: if you understand the intent, then it is not incorrect. It might be marked for you, but that does not make it "wrong" or "bad" or "poor" universally.

[1] https://en.wikipedia.org/wiki/Markedness


I'm sure there's some validity to this from an anthropological/linguistic standpoint (I'm a big fan of "sounds prescriptivist but ok").

But look at your own comment:

> Context is important. If you are writing, say, an academic paper that you want to publish at a particular venue, then you are expected to adhere to the grammatical conventions of that venue.

This is a technical blog post addressing a technical audience. In fact the sentence in question is explaining a technical concept. Therefore the (unwritten?) rules of technical writing apply.

So it's inappropriate here to give a lecture on linguistic relativism.

For the purposes of the article, its audience, and the particular sentence in question, this is poor grammar.


> This is a technical blog post addressing a technical audience.

I disagree; it is a piece of personal writing. If I write a blog post, I can use whatever grammatical constructions I like. I don't think there should be any expectation of a particular grammatical style in an individual's personal writings.

This was not a peer-reviewed article, nor was it submitted for publication at a specific venue. It was self-published on the internet.


I see "poor grammar" in the context of people writing things that they would never say, and would mark as wrong if they were to read it out loud.

I also see "poor grammar" from non-native speakers. Including me: Duolingo was correct today when it told be that I shouldn't use "tu es" as a French subjunctive. No native speaker would say that. I just couldn't hear that what I typed was different from "tu aies" -- though I should have known from context.

You're absolutely right that much of what people tag as "poor grammar" is in fact a different dialect. It's perfectly natural to them and the people around them, following a set of rules that they know but may never have written down. They may evolve from mistakes that become persistent and thus valid in context.

Speaking in the wrong register or dialect is an error of its own -- not a grammatical one, but a semiotic one.


FYI: Neither 'tu es' nor 'tu aies' makes any sense to me (disclaimer: I have only learnt Spanish on the street). https://www.spanishdict.com/conjugate/ser

Perhaps you were trying to write down what you heard? 99% of the time there is a single canonical translation from verbal to written and vice-versa; English speakers have trouble learning that for some reason.


Sorry, I should have provided context. This was French, not Spanish. (I've clarified it in the original comment.)

If it had been Spanish, transcribing it would be easier. French mooshes a lot of things together, so that "es" and "aies" are both pronounced "eh".


Note that they are two different verbs (to be and to have, respectively), so the context should have made it clear which one you should use.


I don’t know much French at all but I do know my way around English in really silly ways, and it’s amusing to me to superimpose this distinction on an English language context. Because they’re two different verbs which we use in conjunction for one existential verb, which barely exists without recruiting other verbs. Having had this state of amusement, I will have left these thoughts for your future amusement if you’ll have them too.


Yep. It was completely clear. I just wasn't thinking clearly.


I think it is generally polite when editing to add something to show you edited it. Especially when you significantly change details, or if the change is due to a comment, otherwise the thread looks like nonsense.


I mean, I agree with all that, but also in practice there actually are English utterances that a vast majority of native English speakers will interpret as indicative of a profound lack of familiarity with the English language, and it's reasonable to describe those utterances in normal conversation as being "grammatically incorrect."


The technical term for such utterances is "ungrammatical", for what it's worth. This is a term with less uhhh shall we say "ideological baggage" than one built with the word "incorrect".

That said, the specific example being cited from the original blog post is not considered marked in many native English dialects.

And, indeed, I believe that we should be less concerned with such labels even with non-native English speakers. Since English has effectively become the world's go-to language of international communication, we should seek to understand other people's intent rather than attempt to correct arbitrary (and, frankly, unimportant) artifacts in their grammatical constructions.


> The technical term for such utterances is "ungrammatical", for what it's worth. This is a term with less uhhh shall we say "ideological baggage" than one built with the word "incorrect".

But my point is that people will use phrases like "grammatically incorrect" in casual conversation (far from any context of linguistics or any social science) to describe certain utterances, and there are cases where this is not unreasonable.


Any sources on poor grammar being a new thing? It's been taught in public/private schools in England for centuries, maybe even millennia.


Pedagogical history isn't a strong argument for something's truth, just its usefulness to someone. Theology was considered a key component of a higher education for centuries and is still taught for example.

People have always strongly believed, and taught, things about their own languages and languages in general. They're not all wrong necessarily. But linguistics over the last like 80 years has produced some models that have incredible predictive and explanatory power, and are incompatible with a view of an objective "correct" or "best" or "standard" form of a language.


I think it is relevant though if the original point was that grammar is a new thing to solidify class structures. I’m not saying that class structures and correct grammar aren’t relevant, I’ve been corrected by drunk humanity-studying Etonians enough times at the pub, but I don’t believe that it’s some recent invention to solidify class structures.


But what do they say about poor?


Sorry, I think I was not clear enough.

There have certainly been efforts to draw arbitrary distinctions between groups based on speech for a very long time, probably going almost as far back as language itself.

What I was referring to was specifically an effort rooted in a desire to use rules of Latin grammar to restrict constructions in English. This was a big thing relatively recently (likely sometime in the past 100-300 years). It is this movement from which we get rules like "never end a sentence with a preposition" or "don't split infinitives" --- those sorts of things.


> rules like "never end a sentence with a proposition" or "don't split infinitives" --- those sorts of things.

Just because those particular rules are silly doesn't mean all linguistic prescription is wrong. Prescriptions that clarify the meaning and relational structures between concepts can be very helpful, and discriminate only against careless users of language.

Grammar poor not does if exist, this like having have conversations fun.


Those things are irrelevant to the question of whether "never can't never" is incorrect in context, which is what you were replying to. Nor was the comment you replied to trying to draw distinctions between groups.


And beware that markedness is pronounced as three syllables like mar•ked•ness[1]

Is marked pronounced as two syllables, mar•ked, in the context of linguistics? I am guessing it would definitely be in British English[2].

[1] https://m.youtube.com/watch?v=vrZpjr44c1w

[2] https://www.reddit.com/r/linguistics/comments/dwo5gj/is_it_a...

Sorry for the poor source links, I only did a quick Google. Disclaimer: I am from New Zealand, and sometimes how we say things is special, and sometimes how I say things is even specialer.


In my linguistics courses (at a university in the United States), professors pronounced "marked" as a single syllable.

Generally, when indicating we want to give an E that is normally silent its own syllable, we use the grave accent. E.g., for "marked" we would write "markèd".

At least in General American dialects, "marked" is rendered as /markt/ because of rules about vowel epenthesis and consonant voicing assimilation. /markd/ would indeed sound odd to the American ear, I think. But when you explicitly add the vowel back in, the voiceless /k/ no longer forces the /d/ to assimilate its voicing characteristic, so you get /'mar.ked/. (I am using informal IPA because I'm too lazy to go copy/paste the specific symbols, but in this context they would add little anyway.)


>Most of the grammatical "rules" we tend to be taught in primary and secondary school English courses were invented relatively recently, and they were created specifically for the purpose of establishing an arbitrary class distinction.

That's a very hefty accusation. Do you have any proof that "they were created specifically for the purpose of establishing an arbitrary class distinction"?

>Now, there's something to be said about markedness [1]. If the majority of a specific group of speakers of a language find a particular utterance (phrase, sentence, pronunciation, etc) to stand out in a way that feels unnatural, then that utterance is called "marked".

Also known as "poor grammar".

>There do exist large groups of native English speakers for whom this utterance is not marked. In these groups, this is a regular way to emphasize a negative. So, in those groups, the utterance is unmarked.

Yep, in those groups the utterance isn't considered poor grammar.


> Do you have any proof that "they were created specifically for the purpose of establishing an arbitrary class distinction"?

Apologies; you are correct that my assertion is overly strong. I should rather have said "although we can no longer ascertain for certain the purpose for which these rules were introduced, some believe the origin to be rooted in a desire for creating an arbitrary class distinction."

Many of the "rules" we were taught can be traced directly back to an ongoing movement in the relatively recent past (like within the last 100-300 years, I think) to regularize English according to Latin grammar. This includes rules like:

- never end a sentence with a preposition

- never begin a sentence with a conjunction

- never split an infinitive

But we can find literary examples breaking any of these rules going back further than the origins of the rules. The rules were made up by people who revered Latin and thought English ought to be more Latin-esque.

The correlation to class distinction comes from the fact that people who tended to study Latin and revered Latin in this manner were members of an aristocracy, theocracy, or similar upper class, and there are other instances of these groups' efforts to arbitrarily distinguish themselves as being distinct from and superior to other groups of people.

There are also more recent efforts in a similar vein, as well. For example, in the United States there is a dialect commonly practiced by Black people in certain areas called African American Vernacular English. This dialect has a number of characteristics that are frequently labeled as "uneducated" or constituting "poor grammar". Although it is not controversial to say that these characteristics are marked in General American dialects, the specific connotations of the labels are generally understood to be rooted in a desire to portray Black Americans as belonging to a lower class, regardless of each individual's actual socioeconomic status.

> Also known as "poor grammar".

No, markedness is not also known as "poor grammar". There is a huge distinction.

The notion of markedness is that, for some group of people, a given utterance will strike them as not being something they themselves would say. Markedness is always framed in terms of a cultural context, by which I mean that it is okay to say that Americans generally find the British "go on holiday" to be marked --- it sticks out in the American dialects. This is a subjective assessment.

But "poor grammar" specifically seeks to attribute a negative connotation to the utterance. It says suggest that people who construct such utterances are, somehow, uneducated, or else are outsiders. It says that they've done something that is objectively (not subjectively) incorrect or wrong.

> Yep, in those groups the utterance isn't considered poor grammar.

Not always. Again, the terms are distinct in intent.

I don't think most Americans would listen to an English person say "I will go on holiday" and call it "poor grammar". They might say "that sounds funny" or "I'd never say it that way" or whatever else, but I think most would not use a phrase with such a strongly negative connotation. Many Americans view people with English accents (specifically accents closer to Received Pronunciation, though others are included) as having increased intelligence or other intellectual merit, for whatever reason, and so they will avoid labeling the speech of such people to be "incorrect" or "wrong" or "bad"; they will instead simply label it as "different".

However, when many (especially white) Americans hear utterances like "I been going to the store" or "I done finished my homework" or "I ain't never seen such a thing", they will label those as "poor grammar". They are specifically making a negative assessment of such utterances.

But all the examples are considered equally marked in General American dialects.

So my desire to distinguish markedness from "poor grammar" is rooted in a desire to fight stigmatization. There is nothing inherently wrong with those latter examples I gave, especially because almost all native English speakers will understand their intents (at least generally) without issue.


> But "poor grammar" specifically seeks to attribute a negative connotation to the utterance. It says suggest that people who construct such utterances are, somehow, uneducated, or else are outsiders. It says that they've done something that is objectively (not subjectively) incorrect or wrong.

You're completely ignoring that people do make mistakes and get corrected. There is nothing in the post you originally replied to implying that anyone is uneducated or an outsider - you're introducing that. "Poor grammar", "bad grammar" and "incorrect grammar" are interchangeable and aren't about class.

If I accidentally type "Eat I bread", it's incorrect grammar even if there happens to be some dialect somewhere where it is valid.


> There is nothing in the post you originally replied to implying that anyone is uneducated or an outsider

That's literally what the "poor" in "poor grammar" is doing, though. The phrase itself suggests that the speaker is lesser-than simply by virtue of not speaking in the same way as the listener would.

> If I accidentally type "Eat I bread", it's incorrect grammar even if there happens to be some dialect somewhere where it is valid.

It is not "incorrect". It is different. It is marked, maybe, for many native English speakers. But these are distinct things from "incorrect".

I do not understand how you can say "There might be a group somewhere who finds it valid, but I don't find it valid and therefore it is incorrect." Why is your personal dialect the one that determines correct/incorrect? If you are American, is it incorrect to say "to go on holiday"? If you are British, is it incorrect to say "Microsoft is going to X" instead of "Microsoft are going to X"?

The use of the terms "correct" and "incorrect" suggests that there exists a universal ground truth, and my whole point in this discussion is that no such ground truth exists. If there is no universal ground truth, then there can be no correct/incorrect. The only way to get around this is to establish a specific cultural context where a ground truth can be agreed upon (e.g., a style guide used by a particular publication venue). Otherwise, at best you can only say "I would not phrase it in that way", but you cannot say "that is incorrect."


> That's literally what the "poor" in "poor grammar" is doing, though. The phrase itself suggests that the speaker is lesser-than simply by virtue of not speaking in the same way as the listener would.

No, the "poor grammar" is just meaning "grammar that is incorrect" - I assume you're claiming that the meaning involves the speaker being poor? If that's what you believe, then I doubt anyone will convince you otherwise.

> It is not "incorrect". It is different. It is marked, maybe, for many native English speakers. But these are distinct things from "incorrect".

Literally everyone in this thread is using "incorrect" and being understood by everyone, so I don't see that you have any ground for claiming that "incorrect" is incorrect.

And you're still entirely missing that some things can be mistakes.


>The use of the terms "correct" and "incorrect" suggests that there exists a universal ground truth, and my whole point in this discussion is that no such ground truth exists.

The universal ground truth is the intersubjective understanding of language norms.

>The only way to get around this is to establish a specific cultural context where a ground truth can be agreed upon (e.g., a style guide used by a particular publication venue).

Yes, and such context exists. The main prestige dialect in the US is General American, all other dialects are marked geographically, culturally, professionally or in some other similar manner.


>I should rather have said "although we can no longer ascertain for certain the purpose for which these rules were introduced, some believe the origin to be rooted in a desire for creating an arbitrary class distinction."

That sounds more believable. Some people believe that, and some people believe that the moon landing wasn't real. Many languages like Spanish, French, Russian or Chinese are regulated at the national or even supranational level. That feels like a good thing, and it shouldn't be discouraged just because some people believe in weird things.

>No, markedness is not also known as "poor grammar". There is a huge distinction.

>But "poor grammar" specifically seeks to attribute a negative connotation to the utterance. It suggests that people who construct such utterances are somehow uneducated, or else are outsiders. It says that they've done something that is objectively (not subjectively) incorrect or wrong.

Markedness is just as objective as poor grammar. It is all about a specific group of people in a specific time and place. As times change, so does grammar and what is considered poor.

>Not always. Again, the terms are distinct in intent.

Well, in the strict sense "markedness" has a different meaning. The way you described it more or less resembles poor grammar in most cases.

>I don't think most Americans would listen to an English person say "I will go on holiday" and call it "poor grammar". They might say "that sounds funny" or "I'd never say it that way" or whatever else, but I think most would not use a phrase with such a strongly negative connotation.

They probably understand that it is a standardized learned prestige dialect.

>So my desire to distinguish markedness from "poor grammar" is rooted in a desire to fight stigmatization. There is nothing inherently wrong with those latter examples I gave, especially because almost all native English speakers will understand their intents (at least generally) without issue.

Yes, I understand. Americans have a knee-jerk reaction against everything that feels a tiny bit racist. However, propping up a single standardized language and discouraging deviations is a path that most nation-states have embraced.


Given that the OP has posted to say that it was a mistake and hence isn't "unmarked" as you put it, you've quite missed the point. Not all mistakes (and corrections) are evidence of some classist plot.


On the contrary, I believe you missed my point! My issue wasn't with the correction; my issue was with the use of the term "poor grammar". I do not believe it is acceptable to use that term in any context, because it has a specific history and connotation.


Hey, thanks for pointing that out. I should have said "can never" instead. It is hard being a non-native English speaker.


This is the missing documentation on the `never` type. Very nice write-up!


Didn't know it was possible to do exhaustive matching in TypeScript; quite impressed, to be honest, but this looks a bit too complicated to be used widely.
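
The usual trick looks roughly like this (a sketch, not taken from the article; the `AppEvent` type is made up):

    type AppEvent =
      | { type: 'click'; x: number; y: number }
      | { type: 'scroll'; deltaY: number };

    function handle(e: AppEvent): void {
      switch (e.type) {
        case 'click':
          console.log(e.x, e.y);
          break;
        case 'scroll':
          console.log(e.deltaY);
          break;
        default: {
          // If a new AppEvent variant is added but not handled above,
          // `e` is no longer `never` here and this assignment fails to compile.
          const unreachable: never = e;
          throw new Error(`Unhandled event: ${JSON.stringify(unreachable)}`);
        }
      }
    }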


Is the never type just an alias for an empty union, or are these two somehow different types in TypeScript?


Great writeup on `never`, love to see it!


tldr: A method that is typed as returning `never` either never returns (infinite while/for loop) or throws an error. And as return types can also be conditional, your given inputs may switch the return type to `never`, allowing TypeScript to tell you that code following that line will never be executed.
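
Roughly, for both halves (names are made up; a sketch rather than code from the article):

    // Never returns normally, so its return type can be annotated as `never`.
    function fail(message: string): never {
      throw new Error(message);
    }

    function readPort(raw: string | undefined): number {
      if (raw === undefined) {
        fail('PORT is not set');
        // Everything past this call is treated as unreachable,
        // because `fail` is declared to return `never`.
      }
      return Number(raw); // `raw` is narrowed to `string` here
    }

    // Return types can also be conditional and collapse to `never` for some inputs:
    type ElementOf<T> = T extends readonly (infer E)[] ? E : never;
    type A = ElementOf<number[]>; // number
    type B = ElementOf<string>;   // never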


This is only one use case for never. The article goes into a lot more than this. For example, using never to create derived types is pretty common.
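
For instance (a sketch of the common pattern, not code from the article):

    // Mapping unwanted union members to `never` filters them out, because
    // `never` disappears from unions; this is how the built-in Exclude works.
    type MyExclude<T, U> = T extends U ? never : T;

    type Keys = 'id' | 'name' | 'internalFlag';
    type PublicKeys = MyExclude<Keys, 'internalFlag'>; // 'id' | 'name'

    // The same trick drops keys when remapping them in a mapped type:
    type WithoutInternal<T> = {
      [K in keyof T as K extends `internal${string}` ? never : K]: T[K];
    };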


nice


Sweet! 'never' seems to have historical roots in IE's ancient 'unknown' js type.


As long as you're stuck with null and undefined, you might as well throw in never for $3 billion!

"My favorite is always the billion dollar mistake of having null in the language. And since JavaScript has both null and undefined, it's the two billion dollar mistake." -Anders Hejlsberg

"It is by far the most problematic part of language design. And it's a single value that -- ha ha ha ha -- that if only that wasn't there, imagine all the problems we wouldn't have, right? If type systems were designed that way. And some type systems are, and some type systems are getting there, but boy, trying to retrofit that on top of a type system that has null in the first place is quite an undertaking." -Anders Hejlsberg


But it's worse than that.

    var foo = new Array(5)
That code will make an array of length 5, but the slots' values aren't actually null or undefined; they're another kind of "empty" value (a hole). You can't access this value directly (it will convert to undefined on access), but it still exists.
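
A small illustration of the difference (output comments are approximate, as seen in a Node console):

    const holes = new Array(5);   // length 5, but five empty slots ("holes")
    const explicit = [undefined, undefined, undefined, undefined, undefined];

    holes[0];                     // undefined (the hole converts on access)
    0 in holes;                   // false, the index doesn't even exist
    0 in explicit;                // true, the index exists and its value is undefined
    holes.map(() => 1);           // [ <5 empty items> ], map skips holes
    explicit.map(() => 1);        // [1, 1, 1, 1, 1]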

EDIT: there's another kind of undefined that happens within the Temporal Dead Zone too


This `never` type is the opposite of a null value, though. The "billion dollar mistake" (which is really the absence of non-nullable references, not merely the presence of null or undefined) adds a possible value to every reference. At any point you could have a valid reference, or a null reference, and this makes it impossible to express the type of a reference which must be valid. The `never` type by contrast has no possible values. It offers a way to statically assert that nothing has been left unhandled.

A type system without `never` (or equivalent) is like a number system without zero. (Literally—`never` is the zero for algebraic data types.)
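
Concretely, in TypeScript terms (a small sketch of that algebra):

    // `never` behaves like zero in the type algebra:
    type A = string | never;     // string (x + 0 = x)
    type B = string & never;     // never  (x * 0 = 0)
    type Pair = [string, never]; // a tuple type with no possible values (uninhabited)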



