JavaScript Is Weird (jsisweird.com)
376 points by robin_reala on June 28, 2021 | 369 comments



0.2 + 0.1 === 0.3

That's not really a JS problem, that's a floating point problem. Plenty of languages will have the same issue.

+!![]

"" - - ""

(null - 0) + "0"

Calling these things weird is fair enough, but I can't help thinking this is code you'd never actually write outside of the context of a "Look how weird JS is!" post. It's like picking examples from the International Obfuscated C Code Contest to show how hard it is to understand C. Yes, some languages enable you to write weird garbage code that's hard to reason about, and ideally they wouldn't, because enforcing sane code would be great, but come on. Most of these things aren't a big problem that we suffer every day. Just be moderately careful about casting things from one data type to another and all these problems go away.
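
On the first point, for anyone who hasn't actually run into it, this is roughly what the floating point issue looks like in practice (plain JS, runnable in any browser console or Node):

  // 0.1 and 0.2 have no exact binary (IEEE 754 double) representation,
  // so the sum picks up a tiny rounding error.
  console.log(0.1 + 0.2);          // 0.30000000000000004
  console.log(0.1 + 0.2 === 0.3);  // false

  // The usual workaround is to compare within a tolerance:
  const approxEqual = (a, b, eps = Number.EPSILON) => Math.abs(a - b) < eps;
  console.log(approxEqual(0.1 + 0.2, 0.3));  // true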


I think the situation is a bit different in reality. The result may be (null - 0) + "0", but the actual code will be foo() - bar() + baz(). And C will at least give you a warning about types, even if NULL - 0 + "0" could give you an address. Plain JS without extra tooling would happily give you the unexpected result. Some other dynamic languages would at least throw an exception about incompatible types for -/+.
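
To make that concrete, a small sketch (the function names are made up purely for illustration) of how innocent-looking code ends up evaluating exactly that expression:

  // hypothetical helpers, purely for illustration
  function basePrice() { return null; }      // e.g. a lookup that found nothing
  function surcharge() { return 0; }
  function label() { return "0"; }           // oops: a string, not a number

  const total = (basePrice() - surcharge()) + label();
  console.log(total);         // "00" - silent string concatenation
  console.log(typeof total);  // "string"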


We've had these exact same sorts of issues in PHP. It can go undetected for a while and cause subtle bugs. A good type system helps a lot. I appreciate that Kotlin is more stringent than Java, with no implicit conversions between Int/Long/Float/Double.


I once lost most of an afternoon debugging an issue where orders in a PHP e-commerce system would very occasionally fail.

Turns out several months before, someone was doing some refactoring, moved some methods around, but also changed a "==" to a "===" in the process. Generally a good idea, but it slipped through to production without anyone noticing or breaking any tests.

The issue ended up being that a rare code path in a tangentially related method would cause that method to return a float instead of an int. This propagated through, eventually causing a check of 0.0 === 0 to fail where previously 0.0 == 0 passed.


The problem here is == was used in the beginning. Always use ===.


Unfortunately "use X from the beginning" is rarely a solution when you're no longer at the beginning.


Because it's more to do with weak typing as opposed to dynamic typing. Many dynamic languages are strongly typed.


> C will at least give you warning about types

For the same reason that, as the saying goes, there is no such thing as a "compiled language", no, "C" doesn't give you a warning about types. The compiler you're using gives you a warning. If you want a typechecking pass over JS, then use a tool that does typechecking. Eschewing a typechecker and then complaining about the lack of type mismatch warnings, however, makes little sense.


Sure, these are true but very specific points. In day-to-day usage, though: when you're writing plain JS, you have to do extra work to get type checks. When you're writing plain C, you have to use an unusual environment to not get basic type checks.


So? The majority of the business world does their desktop computing with Microsoft Windows, but it doesn't mean you have to. The same principle applies here. If you don't like your environment, fix it. Choosing not to and then complaining about the result makes little sense.


You don't control the whole environment. You'll likely use some libraries where people didn't use type checkers and wrote libraries in a complicated enough way that the analysis cannot give you an answer. This is where you control some of your environment and fixing it involves forking dependencies and more maintenance burden if you really do want to do it.

In this case, complaining about the environment as a whole does make sense.


Yeah guess this is where Typescript comes in


TypeScript won't magically fix type errors. It is not that hard to sanitize any expected parameter for functions and their output, and TypeScript won't do that for you. So if you typecheck I/O values anyway, using TypeScript only slows down the dev process, as this can easily be done in vanilla JavaScript. No need for more bloat, just some defensive programming.


TypeScript "magically" fixes the need for defensive programming by ensuring you won't write unsanitary code or invoke functions with possibly null&undefined values by accident. So then clearly the defensive programming is the bloat, because you could avoid it entirely by putting a type system in place to prevent you ever putting yourself in a situation where null&undefined get passed to functions you don't want it to be.


This falls apart the moment you're pulling remote data at runtime. You're right back to defensive programming since there's no more type system to help you at that point.


That's nowhere near "falling apart". That's just the simple fact there's no silver bullet.

Someone arguing a little defensive programming is equivalently strong to a type system is clearly unaware just how much work a good type system does for you. I of course agree with you: when you're fetching data that you don't know the type of, recklessly casting it to some type is going to cause issues. This is true in every language that has ever existed. It's also why tools like IO-TS[0] exist, and of course you can enforce this with JSON Schema techniques or custom validators or a million other options.

Edit: in case the ultimate point of this comment is not clear, by using a type system and some type validator on your fetches, you are able to reduce the need for defensive programming exclusively to your fetch points. Clearly, defensive programming at the fetch points was already needed, so this is why I do not agree with the claim TypeScript's value add disappears from remote fetches.

[0]: https://github.com/gcanti/io-ts


So you are saying that because we need to do validation in a very specific case, we should just throw in the towel and do validations every time?

The IO entry point of your code will always be unknown no matter what programming language you are using. In TypeScript, you do validations in these cases to make sure outside data fits into your type system; from then on, the rest of the code (probably about 99% of it) won't need any validation whatsoever because the compiler is already doing it for you.

Bloat is the amount of time you lose doing code review to check if things are possibly null or doing null checks on stuff that is never null or a bunch of other stuff that the compiler will do for you just by writing some minimal types. The compiler does this stuff automatically without getting tired, can't say the same thing for humans.
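
As a rough illustration of what "validate once at the IO entry point" can look like, here's a hand-rolled sketch in plain JS (in practice you'd likely reach for io-ts, zod, JSON Schema, or similar):

  // Validate once, at the boundary, then trust the shape everywhere else.
  function parseUser(raw) {
    const data = JSON.parse(raw);
    if (typeof data !== 'object' || data === null) throw new TypeError('expected an object');
    if (typeof data.id !== 'number') throw new TypeError('id must be a number');
    if (typeof data.name !== 'string') throw new TypeError('name must be a string');
    return { id: data.id, name: data.name };
  }

  const user = parseUser('{"id": 1, "name": "Ada"}');
  // From here on, user.id and user.name have known types; no further checks needed.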


so when the transpiled typescript is used by the next door 'i know javascript'-1337-hax0r and is fed with some arbitrary data, your wonderful conceptual typed world does not exist anymore and that wonderful code eventually fails, because a simple sanity check was too much.


This seems like a pretty bad-faith comment. This user is not proposing they're building a library, and if they were, it would be reasonable to assume they would also extend the IO protections they discuss to the "IO" points of a library - external user input.

Additionally, I think it is safe to say a JavaScript user who grabs a TypeScript library and uses it without importing the types is misusing the library. Imagine someone who had a whole test suite available to them while they develop, and opted to never run it. And then they complained to you that the tests didn't catch any errors. You would look at them sideways, no? Misuse and poor application (human error) are of course things TypeScript cannot solve.


> no need for more bloat

What bloat are you referring to? TS compiles down to plain JS.


Toolchain bloat is still bloat.


To call TypeScript, a very strong type system which compiles down to terse JS with no extra JS for even complex types, and thus accordingly removes the need for all sorts of tests, "toolchain bloat" is a fairly one-dimensional view of things

Edit: I accept the downvotes for my tone and have updated it. However, I do feel that by the exact same argument "toolchain bloat" exists, surely one could seamlessly argue "testing bloat" exists, and it should be transparent from the popularity of TypeScript that it's a good tradeoff


Yup, garbage in garbage out. I'm not a huge fan of JS, but this sort of criticism is absurd.


I'd rather have the language say "Error, this is garbage!" than silently output garbage.


The sad truth is that this stuff was not in the first version of JS. It was added AT THE REQUEST OF DEVS (a decision Eich has said he regrets).

Like most bad things, it's only around because a big company said so.

Like all bad things in JS, there was a push to remove it at ECMA, but Microsoft had reverse-engineered JS into JScript and refused to go along with the changes to fix the weirdness.


The root of the problem is the original intent of Javascript. Javascript was intended to be a small layer of dynamism added to web pages that were mostly defined via HTML which were presented to a human for interpretation. When your user agent is a human trying to look up an address for a restaurant, they can look at a garbled piece of crap where the JS crashed and still maybe find what they were looking for in the rendered text. Limp along the best you can is a great failure strategy for Javascript's original use-case. Only now that we've turned Javascript into a general purpose programming language is this a failure.

Even Javascript's weak typing makes sense in this case. Why automatically convert everything? Because the expectation was that inputs to your Javascript would be HTML attributes, which are all strings. Automating type conversions from strings made sense. But once you move to larger scale programming, Javascript's weak typing is awful.


Which is why we use Typescript


The fact that there has to be a different language on top of your language to make it same days all that needs saying really.


I keep having this argument with my boss but he refuses to let me write machine code.


If the good lord wanted us to code in assembly language, he'd have made transistors operate on mnemonics, not electric currents.


If the good lord had wanted us to interact with transistors based on mnemonics, not electrical currents, he'd have implemented our brains in mnemonics, not electrical currents.


Nitpick: voltage potentials and ion channels. There's not much actual current flowing.


Nitpick over nitpick: synapses are not exactly electrical or ion current based, they have active transporters.

If all electrical activity ceases, does memory survive? (Answer is very likely yes given cryonic experiments. Brain is protein, ion currents have tendency to auto fire on defrost.)


Ah, I was under the mistaken assumption that ion transport was a subset of ion channels but I see now the latter is passive only.


This goes for all compiled languages?


We invented compiled languages because of issues with writing everything in assembly. In other words, we invented C because assembly wasn't very good.

Just like we invented TypeScript because JavaScript wasn't very good.


Pragmatism of a runtime/ecosystem that favors backwards compatibility above all else.


Man do I hate my phone's autocorrect. For anyone confused as shit:

The fact that there has to be a different language on top of your language to make it sane says all that needs saying really.


Typescript doesn’t save you from all the weird things happening at runtime. A missing check at a context boundary and you can have a wild time (been there).


Context boundary meaning where it interfaces with javascript?


The points where you parse JSON, for example.


This particular annoyance has made me a huge fan of Elm's (and other languages') JSON decoders, and specifically No Red Ink's JSON decoding pipeline. All the type safety I could ever want and no falling back to writing defensive JS to maintain safety at runtime.


I've been thinking for a while that modern languages shouldn't default to floating point computations. They're exactly the right thing if you do data stuff or scientific computing, but given how much of the internet relies on things like e-commerce and how often floating point is still misused for dealing with money, coupled with the fact that even many senior developers aren't fully aware of the subtleties of floating point, I don't understand why we keep making them the default.

If a user does need to do scientific computing, they could use a "# use-floating-point" pragma or something so that literals get interpreted as floating point. Otherwise, we map them to rationals.

Of course, doing rational arithmetic is much slower (for example, Gaussian elimination is exponential for arbitrary precision rationals, while it's famously O(n^3) for floating point), so there's a danger of people accidentally using it in triply nested loops etc., but I have a feeling that if you need to do that kind of thing you know that you might run into performance issues and you think twice.


I have been a professional software developer full-time for 12 years and I have only worked on one system that needed an exact decimal representation for money. I just don't deal directly with payments, billing, or account balances. I do sometimes have to represent money, but in simulation or estimation scenarios where nobody gets cheated or audited if it's $0.01 off.

Arbitrary-precision rationals would get extremely hairy extremely quickly and simply break down for a huge number of use cases. People use exponentiation, square roots, and compound interest! If a "senior developer" who actually works with real account balances doesn't understand floating point, why would you expect them to understand why they can take the square root of 4, but the square root of 2 crashes their application? Or why representing the total interest on a 30-year loan at 3.5% takes over 400 bits?

The reality is that software engineers need to understand how computers work for the field they're working in. Many (if not most) programmers will never encounter a situation where they're responsible for tracking real account balances. The ones that do simply need to know how to do their job.


I agree. I think a high level programming language like JavaScript should default to the more "correct" (least surprising) behaviour, and let the programmer opt in to floating point numbers when needed for performance.

In modern JavaScript, you can use BigInt literals by suffixing an n, like this:

  const maxPlusOne = 9007199254740992n;
If I could magically redesign the language, I would make all integer literals be effectively BigInt values, and I would make all decimal literals be effectively decimal values so that 0.1 + 0.2 === 0.3. I would reserve the letter suffixes for more performant types like 64-bit floating point numbers that have surprising behaviour.
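
For reference, a quick sketch of the difference that motivates this (runnable in any modern JS engine):

  // Number loses exactness past 2^53 - 1 (Number.MAX_SAFE_INTEGER)
  console.log(9007199254740992 + 1 === 9007199254740992);    // true (!)

  // BigInt stays exact, at the cost of being a separate type
  console.log(9007199254740992n + 1n === 9007199254740993n); // true
  // Mixing them throws instead of silently coercing:
  // 1n + 0.5  -> TypeError: Cannot mix BigInt and other types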


> "# use-floating-point" pragma

I haven't come across the "pragma" directive in a Javascript context. I guess it's another new feature (I have trouble keeping up these days).


I don't think they have it (although, you can do whatever you want with babel nowadays, I guess). It was a suggestion aimed at programming languages in general.


The javascript equivalent would be an expression like 'use strict';


I was being flip; I should have used a smiley, I guess. Sorry.


Having the full number stack supported (natural numbers - integers - rationals - reals) would be, indeed, awesome.

Sadly, most people don't understand the distinctions, so this will never happen. (Even in reply to your post people keep talking about "decimals", as if the number base is at all relevant here.)


"Decimal" usually refers to a data type which is "integer, shifted by a known number of decimal places". So, for example, if you had an amount of money $123.45, you could represent that as a 32-bit floating point number with (sign=0 (positive), exponent=133, mantissa=7792230) which is 123.4499969482421875, but you would probably be better off representing it with a decimal type which represents the number as (integer part=12345, shift=2), or as just a straight integer number of cents.

The number base is relevant, because money is discrete, and measured in units of exactly 1/10^n, and if you try to use floating point numbers to represent that you will cause your future self a world of pain.
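
A minimal sketch of that "integer shifted by a known number of decimal places" idea in plain JS (illustration only; real money code would use a proper decimal type or library):

  // Represent $123.45 as 12345 cents: integer math is exact.
  const priceCents = 12345;
  const taxCents   = 99;
  const totalCents = priceCents + taxCents;     // 12444, exactly
  console.log((totalCents / 100).toFixed(2));   // "124.44" - shift back only for display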


Non-decimal currencies have existed in the past and there are still some remnants: https://en.wikipedia.org/wiki/Non-decimal_currency


Yes, that's a good point. If you try to use decimal for those currencies, you are in for a very similar world of pain as you are by using floats for decimal currencies.


> reals

This is technically impossible. Almost all of the real numbers can't be represented in a computer. The most you can get is the computable reals. But to be able to represent some of those and to do arithmetic on them, you have to use somewhat complicated representations such as Cauchy sequences.

The use cases for computing with exact representations of (computable) irrational numbers are fairly limited (probably mostly restricted to symbolic algebra). In 99% of cases, if you need the square root of 2 that's probably because this comes from some sort of measurement (e.g. you need the diagonal of a square with sides 1), and if it's a measurement, there's going to be an error associated with it anyway and there's no point in insisting that this number that you measured is "exactly" sqrt(2).

This is different from rational (and, in particular, integer) numbers, in which case there are many valid use cases for representing them exactly, e.g. money.


> This is technically impossible.

It's technically impossible for the integers too.

> Almost all of the real numbers can't be represented in a computer.

Almost all of the integers can't be represented in a computer.

What, exactly, is your point?


These two things are not alike.

For every integer, there exists a computer that can represent it. Even with constant memory, I can right now write a computer program that will eventually output every integer if it runs for long enough.

By contrast, for almost all (i.e. an uncountable number of) real numbers there exists no computer whatsoever that can ever hope to represent any of them.

Another way of seeing that is that, while Z is infinite, any single integer only requires a finite amount of memory. But a real number may require an infinite amount of memory.

The integers can also be represented fairly easily as a type. For the naturals, for example, it's as easy as

  data Nat = Z | S Nat
(ML-type languages allow you to do this very concisely, but you can theoretically do the same type of thing with e.g. Java and inheritance; if you use Scala or Kotlin, use a sealed class, if you use Swift, use an enum, etc.)

The integers are slightly more complicated (if you just try to add a sign, you'll have to deal with the fact that you now have +0 and -0), but still not hard. Rationals are a bit harder in that now you really have multiple different representations which are equivalent, but you can also deal with that.

By contrast, you won't be able to construct a type that encodes exactly the set of real numbers. The most you can do is to provide e.g. an interface (or typeclass) Real with some associated axioms and let any concrete type implement that interface.


You're trying to explain the difference between countable and uncountable infinities here.

The distinction is irrelevant in the context of computers.

> By contrast, you won't be able to construct a type that encodes exactly the set of real numbers.

You don't need to encode exactly, just like you don't need to encode the integers "exactly".

All you need is a way to guarantee a finite number of significant digits in your real number approximation.

Floating point numbers give you that, problem solved.


> Floating point numbers give you that, problem solved.

No. For example, floating point addition is not necessarily associative. In that sense, floating point numbers aren't even a field and it's wrong to say that they can be used as a stand-in for real numbers.

Floating-point numbers are incredibly useful and it is amazing that we can exactly analyze their error bounds, but it's wrong to treat them as if they were real numbers.


> That's not really a JS problem, that's a floating point problem

More accurately it's a binary problem. 0.1 and 0.3 have non-terminating representations in binary, so it's completely irrelevant whether you're using fixed or floating point.

Any number whose denominator (in lowest terms) is a product of powers of 2 and 5 has a terminating decimal representation, whereas only numbers whose denominator is a power of 2 have a terminating representation in binary. The latter is clearly a subset of the former, so it seems obvious that we should be using decimal types by default in our programming languages.

Oh well.
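
You can even see the non-terminating binary expansion from JS itself; the stored double is the repeating 0011 pattern cut off and rounded at 53 significant bits (output below is from Node/V8):

  console.log((0.1).toString(2));
  // 0.0001100110011001100110011001100110011001100110011001101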


While true, there will be rational numbers you can't represent as floating-point numbers no matter which base you choose. And the moment you start calculating with inexactly represented numbers, there is a risk that the errors might multiply and the result of your computation will be incredibly wrong. This is the much bigger "problem" of floats, not the fact that 0.3 is not "technically" 0.3, but off by some minuscule number.


It's not a binary problem, it's a particular binary representation problem. You can represent 0.1 and 0.3 such that it terminates. In Java for example just use BigDecimal (integer unscaled value + integer scale) and you are ok.


The floating point standard defines various float types, the common float/double types are called binary32 and binary64. It also defines decimal types.

> integer unscaled value + integer scale

binary float does the same, just using 2^exp instead of 10^exp for scale.


In pure mathematics, you can get perfect precision with non-terminating fractions. For example, 0.(6) + 0.(3) = 1 is true. The decimal (or binary) representation is just "syntax sugar" for the actual fraction - in this case, 2/3 + 1/3 = 1; or, if you prefer, in binary, 10/11 + 1/11 = 1, or 0.(10) + 0.(01) = 1.

Note: I'm using a notation for infinitely repeating decimals that I learned in school - 0.(6) means 0.6666666...; 0.(01) means 0.010101010101...


Yes, it's a theorem that every rational number can be represented by a decimal number that is either terminating or repeating.


Floating bar numbers are an interesting way of giving terminating representations to more commonly used decimals. Each number is essentially a numerator and denominator pair, with some bits to indicate the position of the division bar separating them.

https://iquilezles.org/www/articles/floatingbar/floatingbar....
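
The numerator/denominator idea is easy to sketch in plain JS with BigInt (a toy illustration of exact rational arithmetic, not the packed bit layout the article describes; sign handling omitted):

  // A rational as a {num, den} pair of BigInts, kept in lowest terms.
  const gcd = (a, b) => (b === 0n ? a : gcd(b, a % b));
  const rat = (num, den) => { const g = gcd(num, den); return { num: num / g, den: den / g }; };
  const add = (x, y) => rat(x.num * y.den + y.num * x.den, x.den * y.den);

  const tenth = rat(1n, 10n), fifth = rat(1n, 5n);
  console.log(add(tenth, add(tenth, fifth)));  // { num: 2n, den: 5n } i.e. exactly 0.4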


> 0.1 and 0.3 have non-terminating representations in binary

No. "1", "3" and "10" can all fit easily in just four bits.

Just use rational numbers and solve the problem for good.


No, it is a problem for any base. For example, the decimal system can represent 1/5, 1/4, 1/8 and 1/2 exactly. But what about 1/3, 1/7, 1/6, 1/9 as decimal numbers with a finite number of digits?

This will be a problem for any base representation when it has to be boxed into a finite number of digits or memory.

One good thing is that decimal is a widely used format, so it is good to go with that for representing stuff. But it is more of an accidental advantage that decimal has. Nothing more.


Did not read the whole parent comment, so my message is redundant. But yes, objectively decimal can represent more numbers (the numbers whose denominators are composed of 2s and 5s).

I think the argument for using decimal as the representation is there; that is where I originally came to know about this problem [1].

[1] - https://www.crockford.com/dec64.html


Discussion on hn about this : https://news.ycombinator.com/item?id=16513717


Bring back BCD hardware (hmm. Does x86 hardware have built-in BCD arithmetic?)


Yeah, a lot of these have nothing to do with JS. I have no idea what the site is trying to accomplish.

I mean:

    !!!true
In what language does that (or its equivalent) not evaluate to false?



Great point, I'll edit my comment to say "(or its equivalent)", instead of "(that precise sequence of characters, no matter what you've redefined true and false to be)". Not sure why I originally wrote it that way.


One might want to add that it does work when you use a variable however. https://play.golang.org/p/6UfNFWm_-JR


The issue isn't that they're constants, but that GP called them "true" and "false" and assigned the opposite values you'd expect. You can break it in exactly the same way with variables: https://play.golang.org/p/EVU84l0A57I

Kinda crazy to me that Go doesn't reserve the words "true" and "false", but ¯\_(ツ)_/¯


Can anyone explain this to me?


Looks like the `true` and `false` _bindings_ in Go are mutable and can be re-assigned. The same thing was possible in Python 2, IIRC:

    False, True = True, False # Have fun debugging!


Ah, I missed that line. Now I feel stupid ;)


!!!true is equal to false in JS, and in Go (and in any other language I can think of.)

In that Go example above, the author is reassigning true and false to be their opposites.


`!` is added to values in most languages as a shorthand for saying "give me the opposite boolean value of this". So `!true` would equal `false`.

Some people add two exclamation points as a shorthand to cast a value to a boolean. So if you wanted to see if something was 'truthy', you could say `!!truthyValue` and it would return `true` instead of the value itself. Literally what you're asking the language is "give me the opposite boolean value of the opposite boolean value of `truthyValue`".

Now you can probably see why three exclamation points is silly, it's not giving you anything that a single exclamation point wouldn't give you. Both `!truthyValue` and `!!!truthyValue` evaluate to false, you are literally saying "give me the opposite boolean value of the opposite boolean value of the opposite boolean value of `truthyValue`"

The example in Go is intentionally misleading because Go lets you reassign the values for `true` and `false`. It's going through all the same steps I described above, but it's starting with a value opposite of what you think it is
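
In code, just to have it all in one place:

  const value = "hello";   // any truthy value
  !value                   // false
  !!value                  // true  - the common "coerce to boolean" idiom
  !!!value                 // false - same as a single !, just written the long way round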


They're defining a const named true and false to their inverse values, shadowing the builtin true/false keywords.


The site does say so in the introduction

> Even if you're a JS developer, most of this syntax is probably, and hopefully, not something you use in your daily life.

So I think you should look at this site more as something fun you might not have known even if you're a JS developer, than as criticism of JS.

That being said, the !!"" isn't that weird of a syntax is it? I see and use the double exclamation mark all the time.


The output is "weird" if you don't know the rules for operator precedence and how things convert to their primitive values.

Most people aren't going to "know" what '+!![]' will resolve to, because it makes literally no sense to combine those operators into a single expression in anything approaching normal code.


I have definitely used the +!! "operator" before. It coerces a value into a 0-or-1 integer. It's not weird at all, just a mechanical application of the logical not operator ! and the numeric coercion operator +.

The fact that [] is truthy is something everybody learns in their first weeks of JS programming otherwise you would be writing `if (someArray)` and wondering why your code is broken.

A weird one would be to explain why +[] is 0 and +{} is NaN. That is nonsensical.
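
For the curious, the "nonsensical" part falls out of how coercion works: unary + converts its operand to a primitive first (for plain arrays and objects, effectively via toString()), and then converts that string to a number. A sketch of the intermediate values:

  String([])       // ""                -> Number("") is 0, so +[] is 0
  String([5])      // "5"               -> Number("5") is 5, so +[5] is 5
  String([1, 2])   // "1,2"             -> Number("1,2") is NaN
  String({})       // "[object Object]" -> NaN, so +{} is NaN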


This is true; a lot of the questions are about automatic type conversions.


Poor frontend devs.. stuff like this has to result in incredibly painful debugging because of assuming code would act one way when it does something else entirely.

> the !!"" isn't that weird of a syntax is it?

Why would someone say not not empty string in code somewhere? Or do you mean seeing !!var_that_could_have_empty_string isn't too weird?


That's what tests and code review are for. Truth is, if any of this kind of code gets into prod, then you've got bigger problems than some arcane JS gotchas.


I doubt that anyone who hasn't written JS will recognize !! as an idiom for converting to boolean.


That kind of type coercion predates JS.

  perl -e 'print !!"whee"'
  1
  php -r 'print !!"whee";'
  1
  awk 'BEGIN {print !!"whee"}'
  1


[flagged]



[flagged]


There is no such thing as standard English. The link says only that the usage is informal, and HN comments are not formal writing.


If your comments were comprehensible to most HN readers, I wouldn't see the problem in using Hiberno-English, or any other colloquialisms.


[flagged]


That's completely your opinion, though - there's nothing in the guidelines or FAQs that say that.

When on HN, I engage with things I find interesting, irregardless of how good their ritten. If we intimidate users with an expected level of ability in written English, we'll be excluding a lot of interesting comments.


English is as weird as JavaScript.


no, it's fine - you just need the right education https://publicdomainreview.org/collection/english-as-she-is-...


In my humble and admittedly limited experience, it's not what you intentionally write, but what gets unintentionally written and needs to be debugged later.

Carmack-like people utilising this stuff for good are few and far between. For your everyday Joe programmer this is a footgun, a very non-obvious and non-intuitive one (hence the footgun moniker), that they write in crunch; it slips past reviews because reviewers are just Joes with a couple extra years, if that; and then when things break after deployment, it's an absolute pain to debug.


Yea.. "clever" devs can be a nightmare to have in teams. "I implemented all of that functionality in less than one hour" is great and all but it is just tech debt that needs to be re-factored later and costs ridiculous amounts of time to support and maintain until its re-written.


> Calling these things weird is fair enough but I can't help thinking this is code you'd never actually write outside of the context of a "Look how weird JS is!" post.

That is the whole premise of the site, though. They even say that these examples aren't common syntax or patterns before you start.


The site is called "JavaScript Is Weird", not "Weird Javascript". Even if they tell you that the examples aren't common, they're still saying that this weirdness is unique to JS, which definitely isn't true in the case of basic floating point precision problems.


> The site is called "JavaScript Is Weird", not "Weird Javascript"

am i being punkd?


The former is a general statement about Javascript itself as a whole while the latter is describing a set of examples.


The same for “== considered harmful”. I scanned the entire comparison table and the only unobvious or error-prone cases are those you never really do in programming.

https://stackoverflow.com/a/23465314

For me it’s only rows [[]], [0], [1], i.e. array-unfolding related, but all others are regular weak-typed comparisons like in perl and other dynamic semantics. <snip> Edit: just realized “if (array)” is okay, nevermind.


> the only unobvious or error-prone cases are those you never really do in programming.

You never do them on purpose. The problem is when you do them by accident because of a mistake in your code, and the error slips through unnoticed, doing the wrong thing.

> weak-typed comparisons like in perl

Perl has separate operators for working on strings vs numbers, so you are always explicit about performing a numerical vs string comparison etc. Not so for JavaScript.


you are always explicit about performing a numerical vs string comparison

But the values which you compare do not have to be of the same type, and it does not end at scalars (which are really ephemeral in their exact typing even in native API, see perlapi). Basically it has two flavors of ==, each with its own preferred coercion. Perl also has contexts, e.g. boolean and scalar which can operate on lists and hashes (@list == 5). While the form is different, semantics are similar.

The problem is when you do them by accident because of a mistake in your code, and the error slips through unnoticed, doing the wrong thing

If your string array contains some [[0]] by accident, I’d say there is not much left to do anyway. === doesn’t report this either, it only has narrower trueness, which may decrease or increase an error surface depending on how your conditions are spelled. And to be clear, I’m not arguing against using === in places where identity^ or strict equality check is necessary or desired (you may desire it most of the times and that’s valid). My concern is that (a) everyone blames == anywhere, for completely zealous reasons, (b) I have to type a poem to test for null and calm down someone’s anxiety.

^ I know it’s not exactly identity, but is close enough for practical purposes


Agree. The tests are silly, but each test highlights a gotcha that you might bump into while debugging some horrible heap of legacy code.


undefined and null sometimes cause problems. IMHO it's good that undefined == null, but some people don't realize it.


I agree. If native APIs didn't return nulls in some cases, and undefined was at least named "undef", null could be ditched. But then again, that's only because JS has the bad habit of treating non-existence and undefinedness the same^. If not for JSON (which has no undefined), we could ditch null. But it's there, and with === it leads to either

  object.someField === null || object.someField === undefined
madness, or to a potential error if a programmer thinks that null can not be there.

We could do an exception for null === undefined, but it’s against its spirit and will not be accepted.

^ languages that treat non-existent key as undefined are usually doomed to introduce Null atom in some form, or to work around that limitation constantly in metaprogramming
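
For what it's worth, this is the one narrow place where loose equality genuinely helps: `x == null` is true exactly when x is null or undefined, so the "poem" above collapses to a single, deliberate check:

  function isMissing(x) {
    return x == null;   // true only for null and undefined, not for 0, "", false or NaN
  }

  isMissing(undefined)   // true
  isMissing(null)        // true
  isMissing(0)           // false
  isMissing("")          // false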


null and undefined are one of the things Javascript actually got right, imo. Undefined is what lets Javascript turn what other dynamic languages would throw as a runtime error into a value. "I do not have a definition for the thing you want" and "the thing you want is known to be unknown" are two totally different concepts that languages with only null to lean on must collapse into a single concept.


I've never found this to be a useful difference in practice. Also, JavaScript really doesn't use it correctly to begin with. Arrays are a mess. Like the poster below, I usually cast to one of them, except I cast to null.


The problem is that undefined is used in places where null makes much more sense (like the result of the find function). I follow the rule to never use null and convert to/from undefined if needed by some library.


Leave it to javascript to pull defeat from the jaws of victory.


Some of them are stretched examples, others come from other languages/constraints (floating point, octal, ...), but some others are legitimately weird and error prone:

[1, 2, 3] + [4, 5, 6] // -> "1,2,34,5,6"

[,,,].length // -> 3


The first is only weird if you expect the + operator to perform an operation on arrays. It doesn't, so each array becomes a string and those two strings are concatenated.

The second is only weird in that you are constructing an array with implicit undefined values, which is exactly what I would expect to happen if my linter didn't complain about the syntax and I had to guess what might be happening.
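
Concretely, what + does there: both operands are converted to primitives, both end up as strings, and the strings are concatenated.

  console.log(String([1, 2, 3]));        // "1,2,3"
  console.log(String([4, 5, 6]));        // "4,5,6"
  console.log([1, 2, 3] + [4, 5, 6]);    // "1,2,34,5,6" - the two strings joined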


> The first is only weird if you expect the + operator to perform an operation on arrays.

Like GP said, comes from other languages. That line can be copy/pasted into python and it performs concatenation.


Both of these examples are well-known (and not unexpected) behaviours. I assume you already know why it behaves like that. If not, I can explain it.

> [1, 2, 3] + [4, 5, 6] // -> "1,2,34,5,6"

What would you expect instead?

> [,,,].length // -> 3

Is there any use case where you would want to deal with sparse arrays?


[1,2,3] + [4,5,6]

An addition operation over two numerical vectors of length 3.

I would expect the result to match its inputs and provide a numerical vector of length 3.

Thus: [5, 7, 9]


The issue there is that arrays aren't first class types in JS, they're just objects with numeric property names.

So if applying an operator to arrays spread the operation across all the elements that way, it would imply that the same should happen generally for all object properties, with whatever weird implications that would entail.

    a.foo = 1
    b.foo = 'there'
    a + b // { foo: '1there' } ?


What language has this behavior by default?


Fortran :)


> [1, 2, 3] + [4, 5, 6] // -> "1,2,34,5,6"

> What would you expect instead?

If I let my first instinct speak:

[1,2,3,4,5,6]

And then if I think a little more, then maybe:

[5,7,9] //with obvious caveats

In no way do I expect what GP actually provided.


I understand where you are coming from. But the addition operator simply does not have any special handling of arrays. The specification [1] clearly defines what it should be used for: "The addition operator either performs string concatenation or numeric addition." As JavaScript is weakly typed, it is the programmer's responsibility to use proper value types with these operators. That limitation (or advantage?) is also well-known and applies to all weakly-typed languages.

[1] https://tc39.es/ecma262/multipage/ecmascript-language-expres...


It seems that by "expected", you mean "expected, by anyone who read the spec" which I don't think is a fair use of that word. Obviously, most JS developers have not and will not read the spec.

I am very happy with JS and TS and I think the coercion rules are easily worked around with linter rules and policies, but they are definitely weird, and I think the language would be better if it simply threw exceptions instead. But then, such an issue should not be surprising for a language that was designed in 10 days.


No, I meant "expected by anyone who learned the language". Knowing the addition operator including its limitations is quite basic. I'm not saying you need to be able to solve all this "JavaScript is weird" puzzles as they are mostly non-sense. But you definetely have to know what you can us `+` for.

If someone does not like the ECMAScript specification, that is fine. But at least use a proper unofficial documentation like MDN.


well I guess the question then is not just what you would expect instead, but at what familiarity with the language should one be asking people what they expect of it?

If you ask experts it is because you want to get an actual correct answer, but if you ask neophytes it is because you want to get an answer that might be obvious even if not correct.


Perl would give you 9 for the equivalent expression of (1,2,3)+(4,5,6) :)


> Is there any use case where you would want to deal with sparse arrays?

Not really. Now explain why [,,,].map((e,i) => i) is [,,,] instead of [0,1,2] please ;)


(assuming you're really asking) It's because JS has a notion of array elements being "empty", and the map operation skips empty elements. Basically "empty" means the element has never had a value assigned to it, but its index is less than the array's length property.

    Array(4)               // [empty × 4]
    a=[]; a.length=4; a    // [empty × 4]
    Array(4).map(n => n)   // [empty × 4]
    [,,1,,].map(n => n)    // [empty × 2, 1, empty]
My go-to way of avoiding this annoyance is "Array.from(Array(N))":

    Array.from(Array(4)).map((n,i) => i)  // [0, 1, 2, 3]
Alternately there's a recent "fill" method, that assigns all elements (including empty ones) to a given value:

    Array(4).fill(1)      // [1, 1, 1, 1]


> [,,,].length // -> 3

I don't think this is too weird if you think about it. JS allows trailing commas, so the last one is ignored. Effectively this is `[ undefined, undefined, undefined, ]`. A syntax error would have made sense here, but the length of three is a result of the usual syntax rules, not a particularly strange quirk of JS.


Sorry to be pedantic, but `[,,,]` creates holes instead of undefined. They are different, because for example `[,,,].forEach(() => console.info(1))` doesn't do anything, but `[undefined,undefined,undefined,].forEach(() => console.info(1))` prints three "1"'s.


I could also make a php is weird page and say:

"WAT $a .= "World!"; ?"


Yes. You could.


I am sure a reasonable and educated programmer will be able to avoid these traps by adhering to certain standards. However, you often have that other person on your team, who does not care about being careful and writes code as if it is to be written once and never touched again. And that's where the danger creeps in.


Entirely orthogonal to these operator semantics being unreasonable, that person should not be on your team: replace them with someone who cares.


Someone had linked on here a website that showed the 0.3 thing in practically every programming language, with the output for each. I wish I could remember the domain/URL because it is interesting to compare languages' defaults. I know most languages have ways to handle it correctly.



Yes this is it! Thanks for that, I do appreciate that they show multiple approaches in each language to showcase which one gives you the desired result.


I know most languages have ways to handle it correctly.

Including JS - http://mikemcl.github.io/decimal.js/


There's a proposal to add a decimal type to the spec, too.

https://github.com/tc39/proposal-decimal

Seems like it's a long ways away from being available, though.


Those last examples happen when you have variables containing those values. Without strict type checking, it gets really hard to know in all situations (especially when you're pulling values from a service) what you will have in there. And if a service changes, you won't have a lot of warning in your client code. So yes, these kinds of errors are very common.


Knowing that the underlying representation will vary is CS101 material. It applies to almost every language because that's how the hardware works.


Hardware doesn't exist in any meaningful way for 90% of professional programmers anymore.

Way too much abstraction for that to be an argument.


I don't agree. Even if you aren't programming microcontrollers, even if you're just building brochure websites for mobiles, you'll do it better if you understand hardware.


> you'll do it better if you understand hardware.

Oh for sure. I agree. That's not what we're arguing though.


Also, things like +!![] work the way they do because, while JS is dynamic, it does have a very strict type system! ![] becomes a boolean, +true becomes a number.


JS has a very weak type system, full of implicit conversions. In dynamic languages with strong type systems, like Common Lisp, such operations typically result in errors:

  (+ (not (not '())))

  The value NIL is not of the expected type NUMBER.
   [Condition of type TYPE-ERROR]
Similarly in Ruby:

  +!![]

  undefined method `+@' for true:TrueClass (NoMethodError)

Interestingly, Python does the same thing as JS in this case, even though it is typically quite strongly typed. Edit: not quite the same, as the empty array is converted to False in Python, unlike in JS and Ruby where it is truthy (in CL, nil/'() IS the canonical false value); but still, Python outputs 0, it doesn't complain like the other two.


> Interestingly, Python does the same thing as JS in this case, even though it is typically quite strongly typed. [...] > Python outputs 0, it doesn't complain like the other two.

yep, this is one of those Python weird bits. in Python, booleans are ints, True is 1 and False is 0. and i don't mean it in a JS-ish way like "they can be converted to...". no, True is the integer value 1. in fact, the type bool is a subtype of int. if you think of types as sets of values, and subtypes as subsets of their parent set, that `bool < int` relation suddenly makes a lot of sense ;)

  >>> +(not not [])
  0
  >>> (not not [])
  False
  >>> 0 == False
  True
  >>> 1 == True
  True
  >>> True + True
  2
  >>> True - False + True * 0.5
  1.5
  >>> isinstance(False, bool)
  True
  >>> isinstance(False, int)
  True
  >>> bool < int
  True
so, if you accept that the operation `not []` makes sense to be defined as True (because `bool([])` is False), and that it makes sense for False to be the integer 0, then `+(not not [])` being 0 is just a logical consequence of that :)

for the record, i do think it's weird for Python to define bools as ints, and to make all values define a boolean semantics via __bool__().


Python was created February 1991, but it didn't have a boolean type until 2002 per PEP 285 <https://www.python.org/dev/peps/pep-0285/>

The rationale for making bool a subtype of integers is for ease of implementation and substitutability (which aids backwards compatibility), as explained here:

> In an ideal world, bool might be better implemented as a separate integer type that knows how to perform mixed-mode arithmetic. However, inheriting bool from int eases the implementation enormously (in part since all C code that calls PyInt_Check() will continue to work -- this returns true for subclasses of int). Also, I believe this is right in terms of substitutability: code that requires an int can be fed a bool and it will behave the same as 0 or 1.

I have some Python code where there are still a number of uses of 0 and 1 for false and true, because it was written before Python added a boolean type.


This is a result of the implicit type conversion feature rather than a strict type system.


We have this on our interview test. It was the one almost everyone got wrong except for a couple people who ended up being really detail oriented and had deep knowledge (as opposed to broad).

We consider >70% passing.


I'd honestly consider these types of questions one of the poorest ways to test front end developers.

The only reason I have learned some of that oddball stuff with JavaScript is because of some job interviews e.g. when I was earlier in my career, I used to Google things like "top 30 questions asked in a JS interview", etc, but I forget it after that until I'm about to look for another job. However I wouldn't do this type of learning any longer, since I wouldn't want to apply for a company asking these types of questions.

In the end at work you should use ES6/TypeScript with linting, proper tests and these cases would never occur.


you're right in both points technically, but if that means that somehow these things don't make JS "weird" for people to learn/program, then i disagree

> that's a floating point problem. Plenty of languages will have the same issue.

yes, many (most) other languages do have the same floating point issues, but that doesn't make them less weird. JS numbers are IEEE floating point numbers, therefore floating point issues/weirdness are also JS issues/weirdness :)

> Calling these things weird is fair enough but I can't help thinking this is code you'd never actually write outside of the context of a "Look how weird JS is!" post.

the verbatim code snippets in particular, yes, you're completely right. but the underlying problems that these snippets exemplify are still there on actual "real" running code. they just look less suspicious on the surface.

> Just be moderately careful about casting things from one data type to another and all these problems go away.

true. but still, these things can happen, and when they happen, they tend to sneakily manifest as weird UI bugs, like rendering "NaN" or "undefined" on the screen (we have all seen those... there's plenty of meme images about them too), instead of noisily breaking with an exception, which is way more noticeable and actionable for us programmers when introducing these bugs in the first place hehe.

it's true that they are not the most common kind of bugs, i'll give you that, but when they happen, they can be incredibly frustrating in my experience, because you may only realize after they have been affecting users for a looong time (maybe years), but you just didn't know, because JS decided it was a good idea to carry on after one of these nonsense operations like adding an array to a number, giving you nonsense results instead of useful (runtime) type errors.

story time!

i remember an ugly case of this which involved some search filters on a big-ish system. the JS code was quite generic in how it handled filters, and looked fine. for some time, all filters were single-value, as simple strings, but at some point the system started handling multi-value filters, which were encoded as arrays of strings. well, the programmers who implemented the UI for multi-valued filters just tried sending array of strings as filters to the existing filtering system, and it seemed to work correctly in the results it yielded, so they assumed the code was prepared for that too (it looked generic enough), and so they shipped the feature that way.

it was only years later, when i was porting some of that code to typescript, that typescript complained about invalid type operations. i was converting between languages pretty willy-nilly and assuming the existing system worked correctly, so i suspected typescript was being dumb with that type error. but no, it was actually complaining about an actual bug. when passing arrays of strings as filters, the underlying filtering system was at some point coercing those arrays to strings by accident. so ['apples', 'oranges'] became 'apples,oranges'.

the search results were always right when the multi-valued filters had only one value (because ['apples'] coerced to string is 'apples') and kinda right in some naive cases of multi-valued filters (because of how the text search engine worked), which was probably why the original programmers thought it was working correctly and didn't give it much more thought or more thorough testing. but the search results were definitely not correct in most non-naive cases of multi-valued filters.

so our users had been affected by this bug of multi-value search filters basically not working for years, and we only discovered it by accident when porting some of the JS code to typescript because typescript was kind enough to tell us about this nonsense operation statically, without having to even run the code. i wish that JS were also kind enough to complain about these nonsense operations, albeit at runtime. it would have made this problem obvious from the get go, and would have prevented the original programmers from shipping a broken feature to thousands of users.

plot twist: although the story may make it seem that i was the competent programmer that figured it all out when porting the code to typescript, in reality "the original programmers" also included me (probably, i don't remember really), just some time before the typescript port :/


In C# you don't have such a problem and 0.2 + 0.1 is exactly 0.3. Proof:

https://dotnetfiddle.net/qTiq6U


In every language where you can use a decimal the result will match exactly. That's... not what's discussed here and it's not a default in c# either.


Unlike other languages, in JS you don't have decimals... So you're stuck writing garbage code, multiplying all floating point numbers by a factor to avoid rounding errors.


The last couple generations of POWER chips from IBM have implementations of decimal32, decimal64, and decimal128. To my knowledge, no other big-name, general-purpose CPU has these.

To "implement decimal numbers" in .net on x86 hardware simply means writing a custom decimal implementation in software.

JS has had implicit integers since the beginning. It has had arrays of integers (what's necessary for fast decimal implementations) since BEFORE the release of WebGL a decade ago. It has also added BigInt this year. Just like .net, there are decimal libraries available if you know to use them.

https://github.com/MikeMcl/decimal.js/

The real takeaway is that modern processors should definitely add hardware support for decimal numbers.


I don't think that's fair: C# has decimals but everyone uses floating point numbers by default so C# developers still need to know that 0.1+0.2 != 0.3


Exactly. Finding flaws in a language doesn't mean the language is bad, just that it has... flaws. The proof that JS is actually pretty good is that it has been used to build so many things. Like the old adage about economics, these criticisms are taking something that works in practice and trying to see if it works in theory.


Just because something is popular isn't reason for it to be good. Examples: fossil fuels, (over)fishing, rage-based engagement, hard drugs.


On the contrary - the number of people taking hard drugs to get high is fantastic evidence that they are good for getting high. Otherwise, why would people be buying and taking them?


People use hard drugs to cope with their problems, and they're rubbish for that.


Uh Fossil Fuels powered the Industrial Revolution and literally created the modern economy.

It's true that being popular doesn't necessarily make something good. But it's also true that having some flaws discovered later doesn't make something that revolutionized the world and massively uplifted the standard of living of basically everyone in it bad.


Just to add to your list: Ed Sheeran


Excepting fossil fuels, none of those things are popular.


No. Popularity does not indicate quality. Judging the quality of something by its popularity is a form of circular reasoning. Especially when there is an obvious alternative explanation, namely that JS is the only language you can run in the browser without transpiling, and for a long time the only one, period.

While the fact JS can be used to build all these things puts a floor on its quality, that floor is uselessly low.


I'm not using popularity as the metric. "1 million identical websites were built with JavaScript" wouldn't say much beyond the first website.

But that there is a very broad range of successful applications partly relying on JavaScript for their success undercuts the idea that JavaScript is inherently rubbish. Whether it has some subjective "quality" is a conversation best left for art galleries.


"The proof that JS is actually pretty good is that it has been used to build so many things."

[French Narrator] 5 minutes later...

"I'm not using popularity as the metric."


If you say JS is "pretty good" how is that not a statement about its quality?


Fair point. I guess I could rephrase it as "JavaScript provides a lot of value".


Judging by the title, it sounded like the biggest discovery of 2021... by someone waking after a long coma. It was discovered in 2012: https://www.destroyallsoftware.com/talks/wat

I really enjoyed the questions... and even though I am working in TypeScript, I got only 9/25. Precisely because I avoid f---ed up parts of JS, except for trivia games (https://dorey.github.io/JavaScript-Equality-Table/unified/, http://www.jsfuck.com/).

Some parts of code need not be understood. They need to be blocked by types, tests, and linters (https://p.migdal.pl/2020/03/02/types-tests-typescript.html).


https://xkcd.com/1053/

There are a lot of people being born every year. If a 16-18 year old teenager is getting into web development, they were 4-6 when that talk was given in 2012. Sure, that talk still appears here and there, but it's past its prime and you'd have to be in the right place at the right time to run into it.

Crockford's book (JS: The Good Parts) was ubiquitous among JS devs who were coding between 2008 and 2015 (give or take). These days, younger devs have never heard of it. Due to the Seinfeld effect, if they read it now, it would seem trite and obvious, without them realizing how revolutionary it was at the time.


Not to mention just because I was alive in 2012 doesn't mean I know or knew about this.

The parent comments floating to the top here seem to have a common theme of "I knew that already how did the rest of the world not also get the memo at the same exact moment I did?"


It's amazing how much of a positive effect that short comic has had on my life.


I know this xkcd strip, and love it by my heart.

Though, I hope that now people learn JS without its worst parts.

In any real codebase, expressions such as `true + ''` should not be understood. They should be eradicated.


if you block JS you get this message

> This website is literally about JavaScript. I mean what did you expect, a .NET application? This website is 99.9% poorly optimized and highly questionable JS. And yet, you have JS turned off.


If you open the console to check after you're finished, it says:

"What, are you going to check if the results are accurate? Go ahead "

But if you open the console while doing the test, it says:

"Hey, stop cheating! "


Why the hell can websites tell that I'm in the console? If I want to inspect some parts of a website (perhaps save an image), then I don't want that website to interfere.


They don’t. They just use “console.log”, you see it because you opened the console. It’s not like they know you’re there and then write a message.


Devtools can actually be detected to a certain extent - https://github.com/sindresorhus/devtools-detect


> Doesn't work if DevTools is undocked and will show false positive if you toggle any kind of sidebar.

This annoys me every time a website tries to detect DevTools, because I use Tree Style Tab (which triggers the false positive). See Samy Kamkar's website [0].

[0]: https://samy.pl/


Seems like this guy is just using a width and height difference to detect this. Also some ‘window.FireBug’ property, which doesn’t turn up much currently in the Chrome or FF console on Ubuntu


This entire project should be copied verbatim into browser bug trackers, imho.


While there are some ways to detect having devtools open[1], this website isn't actually using them. Instead, it just always writes these messages to the console. Before navigating to the answers, it clears the console so you don't see the messages that were logged before.

[1]: https://stackoverflow.com/a/7809413/451847
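
For illustration only (this is a guess at the pattern, not the site's actual source), the trick could look roughly like this:

  // Always log the taunt, whether or not devtools is open.
  console.log('%cHey, stop cheating!', 'color: red; font-size: 2em');

  // ...then, right before showing the answers page:
  console.clear();  // wipe anything logged earlier
  console.log('What, are you going to check if the results are accurate? Go ahead');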


This answer about how Facebook does (did?) it, and why, is also very interesting: https://stackoverflow.com/a/21693931


Not used in this case, but if you open the console docked (which is the default, as opposed to a separate window), the size of the viewport will change.
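
A minimal sketch of that heuristic, roughly what libraries like devtools-detect do (the threshold is an arbitrary guess, and as noted above, sidebars cause false positives):

  const threshold = 160; // pixels; arbitrary
  function devtoolsProbablyOpen() {
    // a docked devtools pane shrinks the inner viewport relative to the outer window
    return window.outerWidth - window.innerWidth > threshold ||
           window.outerHeight - window.innerHeight > threshold;
  }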


Noscript page also includes:

<script>var answers="I SAID STOP CHEATING! >:( You will get answers and explanations at the end, you dang dastardly dangusson."</script>

I love sites that have easter-eggs in noscript/console. Noscript I can see being annoying (i.e. instead of showing content), but there are some exceptions like this site.


People may want to watch "Wat" [1] from 2012 (for a laugh) before doing this quiz.

[1] https://www.destroyallsoftware.com/talks/wat


Good talk!


Wasn't that where the "wat" got started?


Yes, it was the proto-wat.


I have plenty of complaints about Javascript, but most of these aren't it. I haven't done any serious javascript in something like 6 years, and I still got most of this right without much effort.

#4, #13, #14, #21 and #24 are floating point problems and have nothing to do with JavaScript.

A bunch of them are just "learn the language" issues:

#2 I only got wrong because I never noticed JS allows trailing commas. Trailing commas are pretty damn common in most modern languages though. The syntax for empty initialisation of arrays is weird (and I'd argue is a mistake), but the behaviour is exactly what you'd expect.

#5 is the comma operator as also seen in C and C++.

#8 is something nobody should get wrong. Three nots in front of a true is false. Of course it is.

#10 is just an octal literal. They're becoming increasingly rare in modern languages, and were never quite as common as hex literals, but they're also not terribly obscure.

#15 is one I'm slightly ashamed I got wrong. It makes no sense to use increment operators on a bare value. This is common to just about all languages that support increment operators (and it's those that don't agree, if any, that are in the wrong)

#22 is a slightly surprising counterpoint, but it also makes sense. I don't know of a language that has a NaN literal, and JavaScript is no exception. So that NaN is actually a global that is bound to the floating point value NaN. Because it is a global, it can be incremented (but incrementing NaN is still NaN).
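
To make those concrete, here is roughly what they evaluate to in a non-strict console (assuming the quiz questions are along these lines):

  [,,,].length   // 3: the trailing comma is ignored, leaving three holes
  (1, 2, 3)      // 3: the comma operator evaluates to its last operand
  !!!true        // false: three nots
  017            // 15: legacy octal literal (a SyntaxError in strict mode)
  NaN++          // NaN: NaN is just a read-only global holding the float NaN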

Everything else is a type coercion issue.

#18 and #23 are coercions I really disagree with. Coercing things to a number shouldn't produce NaN. That truly is crazy. Throwing is the only reasonable behaviour here. Ruby is even worse than JS here ("true".to_i produces 0), Python gets it right.

Other than that, the coercions all mostly make sense, and are consistent with what other languages do. Now, implicit type coercion (especially under the guise of abstract equality) is a terrible idea because it's easy to do it by accident and get nonsensical results, but the coercions themselves are, by and large, pretty sane.
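
A couple of concrete examples of where the implicit path bites, versus the explicit conversion (the first line is presumably the kind of coercion #18/#23 are about):

  Number('true')   // NaN: the explicit coercion fails silently instead of throwing
  '' == 0          // true: abstract equality coerces both sides
  '0' == 0         // true
  '' == '0'        // false: no coercion between two strings
  '' === 0         // false: strict equality never coerces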


> #2 I only got wrong because I never noticed JS allows trailing commas.

No one used to risk it because IE was inconsistent with what it meant.


I’ve taught programming to people who struggled because they were misusing JS and the answer they were getting back coincidentally worked for some inputs but not others.

Imagine trying to explain to them that this was their fault? Even trying to find the words to explain what happened made me feel bad that their first experience programming was tarnished by a language that will gladly allow you to do nonsense.


Can it be that JS is not a good starting language then?

Maybe strongly typed compiled languages would be a better fit?

One can argue that if you want to program it might be too much information for starters - but IMO basic types are such a fundamental concept that it would be better to teach that concept early on.

It also clears up a lot of newbie confusion, as you cannot assign a string to an int by mistake in a strongly typed language. Such scenarios are then handled by the compiler, so a newly starting dev has quick feedback on fundamental issues with their code without having to ask people around.


> Can it be that JS is not good starting language then?

@ozim, I don't believe you are trolling!

JS is an absolutely awful language for beginners; it's a 155mm howitzer pointed at your foot. Although you can get the hang of the basics quickly, AND it runs in the browser (super-convenient unless you're headless), those "basics" that you thought you'd got the hang of turn out to have semantics that make sense, but are not what you expected.

I wrote JS (about 0.2 of my time, I guess) for my job, for about 20 years. I got 18 out of 25 on that test. Now, I think it was actually an easy test; I got a third of my answers wrong because I haven't learned about a third of JS semantics. I'm either not very good, or not as smart as I like to think.

A certain recent colleague of mine would probably have got 25/25; he was cleverer than me, and he was REALLY into Javascript.

Try a Pascal-type language to learn an imperative-type language. Modula-2 is better. Prolog is (I think) purely declarative, so learning it gives you the view from the other side of the curtain - as it were. Forth is a perfectly-feasible beginner's language; it was the second language I learned.

I can't recommend any OO language for beginners. Not because I think all OO languages are way too difficult, only all the ones I've used. I haven't used Smalltalk.

I haven't used any flavour of Lisp. That language is so pliable that I'm sure there's a flavour that would be good for beginners.

[edit] Actually, I wish I had learned a functional language like Lisp as my starting language; I strongly suspect that would have made me a better programmer throughout my career.


Having designed programming curriculum, I have to disagree that the language is a bad language for beginners (though it is very far from ideal).

To be fair the primary reason is not really a virtue of Javascript per se, but the fact that it runs in the browser is a large pedagogical gain that no other language can easily match and is hard to overstate. Learning to program is overwhelming for most students, and setting up your environment is a major component of that. Being able to run your code natively in the browser is a huge boon. Even better, having that code immediately be useable in a very real sense boosts motivation significantly. For most students it really is preferable to let students get very comfortable with basic code constructs before having to interact with the underlying system in any meaningful way.

At the language level, Javascript also has some nice affordances that let you treat constructs as simpler than they are in your curriculum and expand on them later. I think the best example of this is `var`. Viewing variables as uncomplicated "buckets" that happily contain any data you want to shove in them is a good starting place when introducing the very first building blocks of something like if/else control flow.

Most languages are built with semi-experienced programmers in mind and are ill-suited for many beginners and actively bad for children. As an easy example: as a developer I like seeing type specifications for numbers, but pedagogically having to introduce the difference between ints and floats on the first day is just empirically confusing for most students. Much better for them to be "magic" until you can properly motivate them to care (which can be just a few weeks in!)

I could write on this topic far longer than I have, but the short of it is that Javascript has done a truly surprising number of things right for a teaching language. I'd still love to remove some of the zany examples here if I could though.


For starters it is maybe great, but I see a problem with types that are not explained but create problems when building something more. Because in the end `var` is not just a bucket, and adding strings to numbers bites people.

It is like learning how to snowboard, I am now on a level where I can go downhill quite quickly, but I don't have enough experience to not crash if something unexpected pops up.

Learning that `var` is just a bucket seems like it is going to cause a lot of frustration and pain later on, when the person thinks they get it.


Of course these technicalities can be a problem, but our conversation here is about absolute beginners, sometimes children engaging in a structured curriculum where the problems they face are ones in our control. In that context it's ok to oversimplify if we

1. Tell the student we're oversimplifying, and maintain their trust in us so that they'll follow along when we do so.

2. Revisit oversimplified explanations when they have enough knowledge and context to understand a more detailed and correct explanation.

The point I've been making is that Javascript makes it easy to pace our explanations of complex programming topics while still allowing students to consistently write working code. Most languages require a large amount of either upfront understanding or faith in "magic incantations", both of which hinder learning (this is not a criticism of those languages, they were designed for professionals). So Javascript still manages pretty admirably, all things considered.


I wonder if Typescript would make a great first language, since a subset of it runs in the browser, but the IDE and compiler can prevent some very bad bugs.


It might be a good extension after they'd spent a good amount of time in plain Javascript, 50-200 hours in depending on age IMO (younger -> more).

Adding types is, again, confusing to most beginning students and adding any sort of build process is asking for tears.

The bugs, while unpleasant, just aren't actually that bad pedagogically. Students readily accept explanations of the form "computers are dumb and do what you tell them, if you tell them to do something that doesn't make sense they will do something that doesn't make sense", which is close enough to correct to not give them a terrible mental model that is difficult to unlearn later.

Additionally, at this stage of learning students should be writing many small programs. Such programs are typically easy for teaching aids to debug in seconds (or to suggest a radically simplified re-write).


I am not trolling, just an honest question based on the parent poster's experience.

I agree OO is too much for any beginner. Getting started with procedural thinking + basic types is in my opinion a requirement.

I agree a nice part about starting with JS is that basically the only things you need are a text editor and a browser.

Unfortunately data types will show up one way or another, so there is no point in hiding them. Even creating formulas in Excel one has to understand text vs. number, the difference between '.' and ',', and how the operating system configuration sometimes makes '.' the decimal separator and sometimes ','.


The problem with strongly typed languages is that they make it infuriating whenever you have to work in a language without the same type guarantees. JavaScript made me want to throw my computers off the top of a skyscraper until I discovered TypeScript, and TypeScript just barely makes things slightly tolerable.


Yeah, pretty much everything you mentioned was made clear to me after this.


This is why Pascal and COBOL were the CS101 languages at my college. (RPI class of '85)


I really think we should stop pushing things like that or the wat talk (https://www.destroyallsoftware.com/talks/wat). Sure it's funny and all, but it's not helping people understand. We're presenting Javascript as some kind of stupid black magic tool, which it is not. You can very easily find the standard online, and go through the examples to understand. Sure it's unintuitive, I agree, but lots of things in programming are unintuitive, and when you can't change them you have to learn them.


I don't even think it is unintuitive. There are very few things here (if any) that make no sense, and a few are literally just basic understanding of the floating point spec. And then of course a few questions are basically obfuscated-C levels of code, where you won't get the answer correct unless you get your pen and paper out.

I'd argue that if you don't know about type coercion in JS, you don't really know JS. It's a feature; none of these operators will throw; it's a conscious design choice.


Probably unpopular opinion: Most of these are just the weirdness of the language caused by the (not-so-wise) design decision that those simple operators should never throw. In my opinion, these are not footguns and should not be used to attack JavaScript, because regular programmers are very unlikely to run into them (which is why when presented, they seem so obscure).

However, that is not to say JavaScript is without footguns. For example, the classic `[1, 2, 3, 10].sort()`.
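
For anyone who hasn't hit it, the footgun in question (note it also sorts in place):

  [1, 2, 3, 10].sort()                 // [1, 10, 2, 3]: the default comparator compares as strings
  [1, 2, 3, 10].sort((a, b) => a - b)  // [1, 2, 3, 10]: pass a numeric comparator instead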


JavaScript sorting numbers alphabetically is a fact I simply cannot get over. I know I should be pragmatic as an engineer but this goes against the core of my being. Awful.


Since it’s (imo) default-wrong in English, I’ve chosen to think of it as a nice way to point devs to something that’s locale-aware like Intl.Collator instead. If it “just worked” in English, most of us would never think to use the correct utility instead.


This is a bit of a tangent, but that makes me wonder… are there any languages where this particular collation would be correct?


It also alters the original array, just awful usability.


Most of these edge cases should result in clear errors. Chances are, if you're adding two arrays somewhere in your application, something else has gone terribly wrong, and just telling the programmer "this makes no sense" would be a much more sensible solution than returning garbage and letting it cause problems somewhere else, further from the real cause of the problem.


> these are not footguns and should not be used to attack JavaScript

No one is calling them that except you. The site says it right at the front:

> most of this syntax is probably, and hopefully, not something you use in your daily life.

Your obsession with having to choose Side A or Side B won't let you see this for what it is...

Just showing weird syntax in JS. That's all. Good lord.


I am not choosing a side here.

However, deciding which language to use for a certain task is a legitimate choice that people have to make. When making such a decision, people need to analyse the pros and cons of a language. Countless times, I have seen people claiming JavaScript is bad/inconsistent/weird for reasons like `"1" + 1 === "11"` but `"1" - 1 === 0`. All I am saying here is things like these should not weigh too much when deciding whether to use JavaScript.


the title is "javascript is weird", not "here's some weird syntax in javascript"


I have limited knowledge of web development and its history, but why is javascript a first-class language in web dev when all I hear is "javascript bad"? WebAssembly seems like a much better choice in hindsight, where you can choose a different language to compile down to wasm.


"There are only two kinds of languages: the ones people complain about and the ones nobody uses." – Bjarne Stroustrup

JS may be weird, but the kind of weird that works well enough and is rather easy to pick up and maintain with reasonable rules in place. When it comes to JS my empirical experience has often been: most of the time, the issue is situated somewhere between the keyboard and the chair.


GMail managed to be popular thanks to the XMLHttpRequest() function being implemented in Internet Explorer. Microsoft had a monopoly on software, but it was still possible to do things through a web browser, enabling competitors and websites running on Linux.

Javascript got popular because it was here first, so developers used it and became able to work with it. Once that happened, javascript had inertia which is impossible to stop. Javascript allows one to deploy anything on any platform with a web browser, without copying files.

Webassembly is great, but it's not easy to build WASM files, the toolchain software used (compilers, linkers, etc.) is not mature (only Rust can build to WASM natively), and it requires that every language compile to wasm, so compiler developers need to implement a WASM compile "target", which takes time and is not always possible depending on the language (Python comes to mind, because of its large library, global interpreter lock, etc.).

Also, WASM doesn't have access to the DOM or WebGL, meaning that you still need to make JS calls to interact with a webpage.


> Javascript got popular because it was here first, so developers used it and became able to work with it.

And you could easily copy and learn from scripts before minification became the norm. And you just have to refresh some page after updating your sources to see the result.


Sure, but it's difficult to make newcomers understand why the language still has those nasty behaviors.

In essence, it's very expensive to risk losing backward compatibility or to make large portions of js software obsolete just to remove some bit of language ambiguity.

It's very frustrating but it's true for all languages out there. Same concept as when Linus Torvalds yells "YOU DON'T BREAK USERSPACE". Backward compatibility almost has its own philosophical chapter on software design.

Also remember how painful it was switching from Python 2 to 3. I guess a solution would be heavy usage of linters, TypeScript or other compile-to-JS solutions, but in the end, a lot of developers are just wishing very hard for better solutions.

Personally I am really not willing to become a professional JS dev. I'm too perfectionist and nitpicky to have the courage to suffer so much for such a thing. Deep down I know it's a bad choice, but I'm too lazy.


WASM support hasn't been around for long, comparatively - and I'm not sure if there's been any recent headway with DOM manipulation or GC from WebAssembly.


Webassembly is too recent. It needs to catch on. The momentum behind Javascript/Node is huge beyond comprehension. The engineering that has been poured into the JIT is also quite something. Just scrapping all of this and starting again with Wasm is not going to get traction from anyone.


Can you explain “Javascript/Node”? I thought Node.js was a JS interpreter but not the interpreter used in the major web browsers. Rather, that Node is an interpreter that’s used to run Js in (typically) headless (server) environments, allowing both sides to be written in JS. Is that accurate?


> Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine.


WASM is not a good solution in a lot of situations. In short:

- you have to package your complete runtime in the Wasm binary, this leads to a large binary and memory footprint

- DOM manipulation requires calling Javascript code, and calling JS from WASM is costly


You only need to package your runtime if you’re using a language that requires a runtime :)

Languages that perform their own memory management (C, C++, Rust etc) will be extremely small. Just one big array of bytes/instructions.


you still need to provide their standard library, at least the subset used by your program


That’s the same for everything, no? I assume most JavaScript websites come packaged with their (transpiled, minified) versions of the “standard library” of whatever framework they’re using. And that has to be transmitted as gzipped bundles of plaintext javascript.

Further, tree shaking is much harder with a dynamic language.

My understanding is that a WebAssembly implementation would usually be smaller than a JavaScript one, assuming you’re using a language without a runtime.


In general, a scripting language for a software product (a “browser” being one of them) may be easily substituted in most cases, because this integration always has a clear API border that can be retargeted to another scripting language in a straightforward way. E.g. the vim text editor has vimscript as its main scripting environment, but may be compiled (in addition) with perl, python, lua, tcl, ruby interfaces.

But the guys who control web standards resisted suggestions for other languages for decades, though it was fine to have Flash/ActiveX for the same period of time, until Apple killed it for good. Even JS “2.0” (a theoretical better but incompatible version of itself) had no chance, because reasons.

Javascript is okayish generally, but its unusual parts come from the times when it was used as a glue between textual inputs and some~ data structures.

Stroustrups’ phrase is just a stockholm syndrome (in his case self-induced).

Edit: webassembly support is not really required to transpile anything to a browser, see asm.js


Modern Javascript, and especially compiled-to-javascript languages, like Typescript, Reason, Elm and others, are quite great. Those quirks are funny and weird, but it's not something you actually run into when you write code that isn't actively trying to show the weirdness of javascript.


If you have limited exposure to JavaScript, that’s great. Try to keep it that way. Personally, the article/quiz reminds me what an utter shit field web development really is and that shaving my face with a lawn mower and hot sauce might be the preferable alternative. Also, reading the sibling comments here describing JavaScript as having characteristics like “great defaults” and “reasonable rules in place” make me wonder what fucked up alternate reality they’re all living in where those are true.


Most real world non-JS frontend code compiles to JS, not Wasm. Lots of apps written in ClojureScript, TypeScript, Elm etc. These are mature and Wasm has mostly only disadvantages to offer.


Using WebAssembly currently means programming in C, C++, Rust, or maybe Go. Despite its flaws, many people prefer programming in JavaScript over any of those lower-level languages.


Those are fortunately not the only options, there are many nice HLLs around that compile to JS.


I think a lot of these errors can be fixed with using strict types, aka Typescript.

I've had the same issues with PHP in the past too, which makes me believe that whosoever created these languages had ease of use as a higher priority than strictness, i.e. it's okay to assign bool, int, string to the same variable.. just get the job done.. okay.

On the other end of the spectrum is a programming language like Pascal (or Object Pascal, which I used for some time) where I had to write so much boilerplate code that it made a simple job difficult. Maybe good for big projects, but for checking if an email is valid before form submission, it may be too much boilerplate code for some people with such traditional languages.


I had a similar feeling recently using Swift for a project.

Pure JavaScript is definitely a bit too loose for my liking, but I thought Swift went a bit too far in the other direction.


I had that experience with Swift for maybe the first 2 days I used it. Nowadays I really appreciate Swift’s strictness. It just helps prevent so many stupid mistakes.

I worked on a high-profile project for my company under a strict deadline and Swift was a major part in ensuring we delivered a rock-solid implementation in record time.


The only part of the type system that I feel gets tedious is dealing with numbers, where you're continually casting back and forth between Int, Float, Double, CGFloat (at least the latter is being addressed). I get why it's strict about it due to loss of precision and the madness of floating-point numbers, but sometimes you just don't care


The number thing isn't that big of an issue once you get used to considering what type your numbers should be in advance; instead of going back and forth between number types, pick one that's appropriate and stick to it.


strict types and Typescript are not synonyms. Perhaps you meant to write “for example” instead of “aka.” And furthermore there tends to be some disagreement about the meaning of the words strong/weak, static/dynamic, strict/lax when talking about type systems. Indeed Typescript is quite a weak type system in that type checking does not fully guarantee useful correctness properties about your program because it allows for missing type annotations.


I love the HN comment section — one of the best I’ve seen on the web. But nitpicking this use of a.k.a. doesn’t add anything to the conversation. We all know what they meant, since virtually no one thinks that TypeScript is the only strictly typed language in existence.


The questions covering floating point are not really weird, or rather their weirdness is not specific to Javascript.

It's a fun quiz nevertheless, at least for someone like me who is not very proficient in Javascript.


Especially weird was that the quiz got stuck at question 2 and kept resizing and shifting the position of the question text on selecting any answer until the quiz ended.

CSS in the console is a nice touch though.


> CSS in the console is a nice touch though.

Ty for bringing it to attention. Indeed a cool feature to create emphasis and structure. Haven't bothered with it yet but will consider it. Another cool one is using console.groupCollapsed (as long as you are staying in a callback context).


I got stuck at question one like this. Running Tor Browser latest version, tried in all security modes.

First time I ever saw a form behave like that oO


It doesn't work for me in Firefox at all.


Why are people wasting time on sites like this?

Every programming language has some weird stuff in it.

If you are writing code like that then you are the problem not the language.

Is this just for the clicks? Is this developer click bait?


I took the quiz and enjoyed it. Just have some fun, I don’t think the quiz wants to convey any deeper message.


^ This! It's "learning content" for tiktok-devs I guess.


These examples are amusing but irrelevant.

What I find sorely lacking is sane syntax for working with arrays. Eyeing Python with envy every time.

And some pattern matching would be nice too.



Special syntax for specific data structures is exactly why I hate so many languages. No. We don't need more of that garbage. Map & co. do everything you need and adding yet another syntactic construct on top of it would just break the language even more.


Nobody programs like that, so it's not an issue in real-world applications.


I would agree if it wasn't for the fact that JS is a dynamically typed language.

You can (and sometimes will) run into instances of this without noticing, because while you won't explicitly write

  true++
you might do something like

  x = someFunction()
  x++
not realising that someFunction() might return a boolean under some circumstances.

Anyone who worked with a sufficiently large JS codebase ran into one of these cases and got unexpected results at some point due to this.


true++ and x++ are syntactically different though. The latter is a post-increment of an identifier, and the former is a post-increment of a boolean literal. Different rules may apply to these cases.


This is true, one must always make sure to handle all possible return types, but that's true for all dynamic languages.


Nobody uses straight edge razors for juggling, so they're not dangerous when shaving.


I have used JS for almost a decade and only got half of them right. Specifically, I failed at the last one because I got impatient; the octal one got me too (never use those in JS), as did the true++ one and the ones that require you to know that arrays are converted to strings when used in arithmetic.

I'll just externalize my embarrassment and say that JS is indeed weird! Another excuse is that I typically try to avoid implicit type conversion except for idiomatic "tricks" such as using !! to convert to a boolean.

Cool website though!


I have been using JS for almost 10 years, and have written quite a few big apps with it, both front- and back-end, but only a handful of times have I experienced these “weird” issues. Nobody writes code like that, I only know of these things because there have been many websites like this over the years.


Eh. I'm not a JavaScript developer; just winging it, I got 9/25. Seems like there are 3 broad categories of (my) confusion. 1. Floating point. Floating point is tricky, and when I use it seriously I have to look up a lot of special cases. I often forget I can get NaN, and it'll infect everything it touches. This is common in just about any language.

2. One or two of those , questions got me. I don't _hate_ it, but there's a little frustration there. Mostly I just need to know the syntax. If I was writing javascript for money, I imagine it would bite me once or twice, then I'd just memorize the rule.

3 (the biggie): implicit conversion. Oof. Seems like a lot of types can be converted to other types, and it's not obvious to me what the precedence hierarchy is. With C I have to look up the edge cases around signed/unsigned, but usually it's just use the next bigger one, int -> long. Java does the .toString for everything, which has bitten me once or twice. I really don't have a sense of JavaScript's implicit conversions. Some of those seem really subtle.


I stopped at the first example (true + false).

If you write code like this, you have bigger problems than using JavaScript. For most of the examples, it's like saying "Here's what happens when you plug a fan into a faucet". There's no point really.

JavaScript is weird just like any other language can be weird when you use it in weird ways. And yes, it's certainly one of the weirdest.

If you want to talk about things weird in JavaScript, let's talk about ES modules in Node.js and the Browser, or the number of decisions you have to make to bootstrap a simple web app in JavaScript, and how a language written in 10 days is shipped to billions of devices today.

But this kind of content (while well made technically) is always the same one about JavaScript.


Not really, it's a nice shorthand for "exactly one of these must pass":

  x = checkFoo()
  y = checkBar()
  z = checkBaz()
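  // booleans coerce to 1/0 under +, so the sum counts how many checks passed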
  if (1 === (x + y + z)) {
      ....
  }
It's a rare construction but I have used it intentionally once or twice in the past decade.


> I stopped at the first example (true + false).

So you never did something like this:

  x = performCalculation(getParam1())
  y = performCalculation(getParam2())
  sum = x + y
only to discover that "performCalculation()" sometimes returns a boolean instead of a number? JS is a dynamically typed language after all and functions like

  function f(x) {
    if (x >= 0) {
      return Math.sqrt(x)
    } else {
      return false
    }
  }
are perfectly valid. If you use a 3rd party library that uses return values like this, you might run into such a case without realising it.

Sure, you won't explicitly write "true + false", but "f(x) + g(x)" is not uncommon and might indeed evaluate to "true + false".


some people will be quick to say "oh but then the library is just awful". Yes. Yes it is. Sometimes one has to work with brain-dead stupid libraries, and in those cases, it'd be much easier if JS would just give you an error saying what happened: You're trying to multiply apples with oranges.


I've not run into that kind of issue in years. Read the docs of the function you're attempting to call.

Also, in this case, the function should return NaN so it's not polymorphic.


> Also, in this case, the function should return NaN so it's not polymorphic.

Sure. But what if that function comes from a 3rd party component that doesn't have great documentation or relies on another package and thus a behaviour like this simply bubbles up through the call chain?

And don't think this can't happen - NPM in particular is notorious for this kind of deep dependency. Also mixins and monkey patching are a thing in JS, so just importing a module can lead to an unexpected change in behaviour.


Typescript to the rescue ;)


NaN++ is sometimes a TypeError in environments that define NaN as read-only, so that one's sometimes wrong.

The float examples aren’t weird in JS, they’re the same in all languages that use IEEE 754 floats, which specifies that NaN != x, where x is anything.

“This is due to a decision made by the IEEE-754 committee for a few reasons, such as space efficiency and the fact that the function isNaN didn't exist at the time.”

This explanation feels like it’s trying hard to downplay the reasons and make it sound arbitrary and almost whimsical. It doesn’t make sense to allow NaN - NaN = 0 (Note that Infinity - Infinity == NaN), and it’s important that NaNs used as input to computation produce NaNs as output (other than when using boolean tests).


Yes, Javascript is weird, but at the same time you almost never face many of these issues in the real world. Bad things exist in all languages; for some there may not be the same kind of quirks in the language itself, but rather other things that limit the language in some way.

I have been a web dev for about 10 years now and I rarely if ever run into these issues. More likely, I face issues and bugs regarding state and just flaws in the logic, rather than issues caused by the language not being designed so well.

For sure this can be worrying if you do something really important but at the same time you have a lot of other languages to choose from.


Fortunately nowadays including any of those in your code would award you the review equivalent of bashing your head in.

With the advent of linters most of these patterns started being rightfully considered bad form by default.

Currently even innocent stuff like +new Date() (outputs a timestamp) is already something that I've rarely seen make it through review.

JS is goofy, but ES2015 managed to avoid making the same mistakes. Shame it took so long to implement it.

At the moment the bigger problem is the painful transition from whatever module system you had to ES Modules.

Especially browser<->Node.js interoperability suffered massively from this.

Hopefully there will be no further standards in this field.


I am eternally surprised at how long it took node to support ESM. CommonJS modules served their purpose, but they are about to go the way of JS module systems. This is where the appeal of deno comes in. It's a runtime that tries very, very hard to give you the same API surface as that of the browser. I just hope deno catches on faster.


I consider Deno to be DOA - hopefully I'm wrong.

The scope was just too broad.


Ironically I got a bunch of these wrong because JavaScript is less weird than I thought it was. For example for some reason I thought ![] would be true, because JavaScript would do something weird, but it's not and JavaScript does exactly what you'd expect any sensible language to do.

I didn't enjoy the quiz, it just reminds me how depressing the base layer of the thing I'm most passionate about in life is. The expression [1,2,3]+[4,5,6] returning what it does is just a bummer, it just sucks. Calling it weird is giving it undue merit, as if it's cute somehow that it's so awful.


Website content on a web browser:

> This website is literally about JavaScript. I mean what did you expect, a .NET application? This website is 99.9% poorly optimized and highly questionable JS. And yet, you have JS turned off.

Yes, a website can be about Javascript and still be a collection of documents in web standards, like HTML+CSS, possibly with optional interactive features that could be implemented in Javascript.

I mean, what did you expect? That it is impossible to edit a book on paper about Javascript because obviously paper cannot run Javascript code? Even I have books about Javascript, and they render fine.


I'm actually pretty fond of JavaScript, but here's a question that has caught me out in real code:

    const 
      x = [1, 10, 2],
      y = x.sort();
Now: what is the value of x? (Edit: and y?)


Is the y relevant? Sort is in place and without a comparator function, it's going to convert the numbers to strings, so the array will remain the same.

If the question is about const, const with Arrays/Objects are by reference rather than value, nothing is stopping them from being modified later in the code.


In JavaScript every possible value has a .toString() representation* so it makes the most sense that the untyped/mixed array .sort() does lexicographic sorting. If you need numeric sort, use a TypedArray or provide an explicit arrow function.

* Except Object.create(null), which throws.
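
Concretely (the exact error message varies by engine):

  String({})                   // "[object Object]"
  String(Object.create(null))  // TypeError: no toString or valueOf anywhere on its (empty) prototype chain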


Yes, .sort((a, b) => b - a) is what I meant in this case.

I think perhaps this "makes the most sense" from a language designer/implementor's perspective, but it also gives ample opportunity for WTF moments in actual use.


I've been using JavaScript for around 15 years, and I've never used a comma to separate expressions. And I'd still go to MDN to look up if sort() is in place or not :p


The comma is not material here: the answer is the same if we write `const x = ...; const y = ...`.

And that's the first gotcha: sort() is in place.


Pretty sure I’ve actually tripped on sort() myself before by assuming it was a new array. It does make sense for performance though, creating a new array would be a bad default choice for performance, for the same reasons that forEach is generally faster than map all else being equal.


True. I like Ruby's solution with exclamation marks, such as `sort!`, to make this kind of behaviour obvious.


I guess there's a lot of people who don't work with JS and learn about it from memes like this, so I feel like I have to clarify: I've been writing Javascript (well, actually Typescript, as any other sane person) professionally for quite a few years now, and I can count the amount of times I've actually been bit by these quirks on one hand. And if they added Array.sort() and MAX_SAFE_INTEGER (NEVER serialise and deserialise Facebook's and other long digit IDs as numbers), it's still better than most of the alternatives, in my opinion.


Based on the comments half of the people didn't read the introduction.


UX nit: would have preferred question-by-question feedback on right answer and overall score, quite a lot to get through 25 questions before you get any reward for your effort :)


Same - it was clear to me by question 2 that I was going to be guessing on the rest of the quiz. Having to fill out the rest before getting feedback was a formality at that point.

By the time I saw the answers, I couldn't remember why I guessed which way on each question.


I don't think it is a UX nit. You would have started to catch on and learn if you had immediate feedback.

When is the last time you took an in-person test where the proctor told you if you were right/wrong after every question?

You need to grade current knowledge. In fact, I would have removed the multiple choice altogether for this test and had an input(type=text)

EDIT: OK, ok: I get it, it is just a fun website, not the BAR.


I'm trying to look at weird javascript, not haze new employees.


> 25. - "" + + "1" * null - [,]

> Output: 0

> You answered: I give up

> You answered incorrectly.

No I didn’t.

(Not my fault there were two potentially correct answers!)


You're weird if you write code like this.


The sad part is that you don't have to and still encounter one of these cases.

The reason is that JS is dynamically typed and thus functions are free to return multiple result types. So you might run into one of these and get unexpected results because at some point in your code one of the functions returned an empty array instead of a number or a boolean, or divided by zero, or...

It's good to at least be aware of these edge cases so you can avoid them.


I love that 1/0 = infinity in JS. It's as pragmatic as a tired student doing their algebra homework at 1AM :))


Like others have already noted, 1/0 == inf is not something specific to JS; it is a consequence of JS defaulting to the IEEE 754 floating point representation for all numbers. Any language that supports IEEE 754 FP will have the same result.
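
The same IEEE 754 rules give you the rest of the family:

  1 / 0       // Infinity
  -1 / 0      // -Infinity
  0 / 0       // NaN
  1e308 * 10  // Infinity: overflow saturates instead of throwing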


There are some real problems with the quirks of Javascript too even if the examples are not that common. Some JS coders for instance use the oneof pattern and if you then forget somewhere to check if you got a Number instead of a Boolean you can end up with something weird. I have seen it several times.


JS is a loosely typed language by definition.

The biggest problem for JS, from the beginning, was people starting to write it without completely understanding it. Even I feel embarrassed seeing my initial JS code. No one understands or can handle loose typing, prototypes, closures, and functional concepts like currying before coding. It took me almost a year to understand them, although at that time there were hardly any resources like there are now. But I still see people jumping directly into JS.


The awful letterspacing makes it very hard to know whether strings are empty or single spaces.

(They’re all empty strings).


Yeah, got a couple of questions wrong due to that.


Every language has undefined or unexpected behaviors.

The real lesson is that this doesn't necessarily matter, if they're not essential to the code you write most of the time.

Also, some of those are about floating point math, which is not JS being weird, but rather about needing to know what a float is.


And now imagine, if you had to deal with a programming language, which does not warn you or error out, when you are using null or undefined bindings in procedures, where you should be using strings or other things ... Wait, people actually invented such a language ...


That was the most horrible quiz I've ever taken in my life.

I recently tried to make a simple flask app to sort pictures on my computer. I used file:/// since there were a lot of them.

I was quite unhappy to discover that CORS is quite restrictive...


This covers most of the weirdness with primitive types, but then we have an oddly designed OO system on top of that: prototypes + `this` and constructors + classes bolted on.


It may seem odd, but it became clear to me once I understood the basic idea.

In a language like Java you have objects and classes at the bottom of the language. How would you add maps to the language? Well, you create a Map class, implement put, get, has methods, etc.

Now imagine a language where you have functions and maps as your basic primitives, but you don't have classes and objects. How do you add objects and classes to the language? You can do something like this: objects are maps, information about their basic class is saved in 'prototype' property, constructor is a function that returns an object, etc. That's how JavaScript works. You can read about this in more details here https://exploringjs.com/impatient-js/ch_proto-chains-classes... , if you keep the basic idea in mind the details are easy to follow.
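
A minimal sketch of that idea, spelled out without the `class` sugar (a hypothetical Point example):

  // A "class" is just a constructor function plus an object of shared methods.
  function Point(x, y) {
    this.x = x;
    this.y = y;
  }
  Point.prototype.norm = function () {
    return Math.sqrt(this.x * this.x + this.y * this.y);
  };

  const p = new Point(3, 4);
  p.norm();                                      // 5, found via the prototype chain
  Object.getPrototypeOf(p) === Point.prototype;  // true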


Buggy in Firefox (89): gets stuck on the second question.

Works fine in Chrome.


Works well on Firefox 89.0.2 on Windows 10 :-/


It worked fine for me (Firefox 89 on openSuse Tumbleweed). Maybe some add-on is interfering?


Same for me in macOS with firefox 89. Disabled all addons and same problem.


Here too (89.0.2, macOS).


Works fine in Firefox on Android.


I guess this is why people seem to like Typescript so much.


TypeScript to the rescue! I just prefer strongly typed languages, but it's things like this that disappear (for the most part) when you have types. I personally can't keep types in my head as well as some, so when the language forces me to, it helps a lot. Vanilla JS just makes me feel like I'm walking on egg shells.

Then we can get into a lot of fun when we get to type systems like SML and Rust where you start to really touch on some power (but can become intimidating). I'm still a bit wary of trying to hop into Haskell.


I just wish TypeScript supported optional named parameters. Once I got used to them in C#, it's hard to live without. They make adding enhancements so much easier. The work-arounds, such as object literals, are just not the same.


I did incredibly badly. JS really is weird.


Silly questions; some of these expressions you don't use on a daily basis. The language has quirks, but allows you to do a lot in a very expressive manner.


Could [,,,] be used to optimize an array allocation? Say I'll need an array of fixed size. Could this method be used to help the interpreter to allocate an array more efficiently?
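
Whether any engine actually preallocates from that hint is an implementation detail (nothing in the spec requires it), but for reference, here is what those literals produce:

  [,,,].length             // 3: three holes, no actual elements
  0 in [,,,]               // false: the slots don't exist yet
  new Array(1000).length   // 1000: also just holes plus a length
  new Array(1000).fill(0)  // a dense array, if you actually want the storage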


“11” + 1 = “111”

“11” - 1 = 10

(ノಠ益ಠ)ノ


I mean to be fair how would one subtract a string?

What would "abc" - "def" mean? Would "abcd" - "bcd" = 0? So it's already an absurd thing to do


TIL I don't know JS.


> JavaScript is a great programming language, but ...

Why are we so afraid to call trash "trash"? It's not attacking the creator, but just how can we make progress if things are not perceived as they are?

https://wtfjs.com


Calling it "trash" is a very inflammatory way to describe one of the most popular programming languages in the world. Language like that is against the HN guidelines, because it distracts from your argument to debate about whether JS is "trash" or not.

JavaScript has a bunch of quirks, but it's still great. In the StackOverflow survey, JS reliably ranks highly in the list of "most loved" languages. https://insights.stackoverflow.com/survey/2020#technology-mo...

Node.js is a thing because people liked JS so much that they wanted to use it on the server side. You may think they're all fools, but it's actually pretty nice. I like the latest version of Node better than working in Python (mostly because node_modules are better than virtualenvs, IMO.)

You can use TypeScript to address many of the quirks described in TFA, but you can also reliably avoid them just by never using == and always preferring ===, using String() before using + to concatenate, and using Number() before subtracting.
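
A small sketch of that advice in practice (input is just a placeholder for untrusted data):

  const input = '41';    // imagine this came from a form field
  Number(input) + 1      // 42: explicit conversion before arithmetic
  String(41) + '1'       // "411": explicit conversion before concatenation
  Number(input) === 41   // true: strict equality, no coercion surprises
  input == 41            // also true, but only because == coerces; avoid it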

I basically never encounter situations like the ones depicted in TFA in real code, because I never try to add two arrays or subtract two strings or what have you.


Trash

“Admitting like this has a bunch of quirks, but it’s still great.” (not arguing for it though)

Criticism is a driving force behind change for the better. Calling something “great” without pointing out its great sides I find… pointless and harmful.

To name a few great points, distinct from other languages:

JS has a pretty shameless object model, where you (usually) have ordered keys, and every key is a property with an {enumerable, writable, configurable, get, set} descriptor. It is much more usable and practical than in almost all other languages. (They do that for “efficiency”, but that’s non-sequitur)

JS has very useful destructuring syntax, which many languages lack, and that lack leads to assignment bloat and boredom. Also, the JIT optimizes out temporary objects, so foo({x, y}) / function foo({x, y}) works almost at the same speed as foo(x, y).

JS has a nice Function type, which allows easier metaprogramming and substitution of ‘this’ object. Functions are objects, so you can function foo() {}; foo.x = 1; console.log(foo.name, foo.x).

JS bare objects may have methods and properties: t={_x:1, get_x(){return this._x}, …}

JS doesn’t treat syntactic lists in a last-value-only way, so you can expect foo(…a, …b, c) to work, and to work as intended. E.g. Lua cannot do that, and python (afair) requires you to collect items into a single array (not sure, correct me if I’m wrong).

But what people usually mean by “great” is “it generally works”. I know it is an american thing, but some aspects of js, including those you meet everyday, are really just trashy. It would be nice to not have them at all.


Why is it trash? Or is this just like PHP where people are so accustomed to shitting on something that they're completely oblivious to the progress that's actually been made?

I don't understand what this actually contributes to the discussion beyond it being a tired, beaten-down programming meme.


Yes, it's a pity that java applets have disappeared. Great UX, great development cycle, programming purity... </s>


It is 'trash' for the general purpose 'one-size-fits-all' programming language that its fans want to impose upon the world. (I shake my head at microcontrollers running dynamic languages. "micro-python" = spare me. Line breaks over a TTY? There's the primary question of on-chip resources and a giant shim between the physical realities of a microcontroller and some high-level fantasies about what programming "should be", absent any information on the low level specifics = the issue here.)

Its ES3 API is missing a few things, like Object.keys, etc. I still happily code for browsers in ES5 with a few polyfill functions.

JS works fine for its core competency: manipulating DOM elements and local data representations.

It's not JS's fault that the browser is the way it is and that HTTP is the way it is and that using a remote directory "browsing" protocol via a specialized file browser that renders hypertext isn't a 2-way bound GUI framework...


That’s a load of gibberish. JS as a language has nothing to do with browser APIs, why would you judge it based on that? Also, ES6 was ratified six years ago.

Seems like you had a bad experience with espruino / jerryscript or something and are projecting based on that?

Dynamic languages are easier to program in. That’s a fact, and why they are so popular.


Although I like Javascript (Typescript) a lot, I have to bite:

> Dynamic languages are easier to program in.

Only if you don't care about writing correct and maintainable programs. The only thing that dynamic languages make easier is writing code. Or, more specifically, the first couple of versions of it. That's not what typical programming as a process mostly consists of.


By the definition of formally correct, yes, you're right. But programming in a dynamic language still allows programs to be functionally correct, and verified by testing, which is more often than not good enough.


Writing and maintaining a program in a statically typed language is significantly easier and faster than doing the same in a dynamic language while manually writing all the tests that would cover the same level of correctness that types give you out of the box.


gibberish? "JS as a language has nothing to do with browser APIs, why would you judge it based on that?" Excuse me? That is precisely what JS is based upon. Lemme give you a prime example: JS = a single threaded event loop, ON PURPOSE. That directly affected Node.js as an implementation.

Your browser has an embedded scripting engine. Embedded, meaning, the source code for your browser includes the scripting language engine, and JS can run on that and script various permitted things via the browser API.

ES6 may have been ratified six years ago, but it's not the target code your front-end build chain spits out, is it... browsers don't uniformly support it yet. fun facts.

Look, I've been programming JS since it was invented. I'm not impressed by "classes" that don't exist, arrow functions, async/await, futures, promises, fibers, and assorted hacks that don't mirror the actual CODE (what computers execute) a JS engine actually is based upon. I don't need these tools, why should I use something that I need to transpile when I can code directly for browsers as they are in 2021, including legacy browsers? I don't find ES6 "easier" in any way..

Regarding dynamic languages on microcontrollers: Dynamic languages cannot directly control memory allocation and manipulation, typically are heap based and have no concept of a stack frame, and are basically just a computer program whose corresponding code instructions (machine code) and execution path are scripted by your high-level language. Read a JS engine's source code, study embedded C, and get back to me as to how suitable you find it for timing-deterministic embedded programming. LOL

Types are directly related to memory size allocations.

Study the recent crop of LLVM languages, including some very interesting ones like LuaJIT, which should be right up your alley, and note the role Garbage Collection (check the Boehm implementation, for example) plays in many of the object-oriented languages, as well as Go.

Look at the actual assembler instructions you require a computer to perform as a result of your high level specification.

I am NOT speaking gibberish.

I agree that dynamic languages are considered easier to program in.

When speaking of programming languages, they are not all on the same level. One cannot say "oh assembler is fine and all but i prefer lua", as it's like comparing apples and atoms.

there's a reason that you can implement a lisp in c, but that the contrary is not viable nor makes any sense. (although metaprogramming c with lisp makes a lot of sense :D)


And you boil all that down to 'JS is trash'?

> JS = a single threaded event loop, ON PURPOSE. That directly affected Node.js as an implementation.

That's not a criticism. It's not different to desktop app dev where you try to keep your processing away from the main thread, only it provides an easier interface through async programming. Synchronous = main thread; async = other thread.

Amazingly intuitive model.

> ES6 may have been ratified six years ago, but it's not the target code your front-end build chain spits out

C++14 may have been ratified 7 years ago but it's not the target code your build chain spits out

> Look, I've been programming JS since it was invented...

You're free to write pure assembly, and you don't.

> Regarding dynamic languages on microcontrollers

Python in particular has made this kind of programming mainstream

> Read a JS engines source code, study embedded C, and get back to me as to how suitable you find it for timing deterministic embedded programming. LOL

You're gatekeeping, and that's also your ego. You need to work on that.

> there's a reason that you can implement a lisp in c, but that the contrary is not viable nor makes any sense.

What!? I can build C in Lisp as much as I can build any other language. I parse the syntax, create machine code, and output a binary. How the hell do you think languages are built? How do you think C was built?

Just stop, man. It's not gibberish but it's bullshit.


Hey, so can I write a device driver in JavaScript, and is it a good language for that? How about my DMA controller and Python? What am I reading here? Not a comp sci major, eh? Pass me something by reference in JavaScript, please.


Anything is a “good language for that” if it meets your needs. If you can run the js VM in 128KB of RAM and have decent performance, why not? I guess you would be ok with a Lisp in a microcontroller even though it would also run a VM?

Do you think Arduino would have had any success at all if you had to write C or Assembly to use it?


What's your point, exactly? You can write a driver in practically any language.

> Not a comp sci major eh?

I guess you're a comp sci major.


I was a comp sci major.

Do you know what a Pointer is?

Do you know what passing by reference means?

Do you know what a heap allocation is vs pushing something onto the stack?

Do you know what code and bss segments are and what they are for?

This is not about being right or wrong; I just can't stand to watch total nonsense go unchallenged.

No, you cannot write a hardware device driver in a dynamic language. That's not to preclude code-generation approaches, but to point out what low-level hardware programming actually consists of: manipulating memory, registers included.


1) I did not originate this "is trash" phrase, but replied to a parent comment you would rather excoriate. I said that Javascript is fine for what it was designed for: a single-threaded UI event loop in a web browser. It is NOT good as a high-volume web server, for example. HTTP has a request/response cycle and is deliberately stateless, such that all data goes out of scope and the entire thing can be deconstructed and, poof, no memory overhead or leaks. It was designed that way, as a protocol.

2) I saw your comment below. You cannot write device drivers in the language of your choice on any of today's popular operating systems, nor on any embedded devices. Device drivers must have low-level access to things like memory locations and CPU registers. Such things are not exposed to Javascript and not available. This is not even speaking of performance and garbage collection, etc.

Look, Javascript is someone's computer program. It can be implemented in very little code; here's an example: https://github.com/cesanta/v7

Languages are not all equal nor do they all function in the same way, and that's not my opinion.

Javascript syntax itself is one thing, and you can certainly feel free to Javascriptify some C++ libraries and make it all look a certain way for specific tasks, while managing things behind the scenes, up to a point... but there is no getting around the fact that SOMEONE, in some lower-level language, has to implement the low-level systems functionality.

The power of Cython or the Python C FFI is that it allows you to script and glue together modular native code.

You then state "C++14 may have been ratified 7 years ago but it's not the target code your build chain spits out"

No, a C++ COMPILER spits out assembler code that then gets assembled and linked into an executable.

The C++ or C code corresponds directly to a given set of assembler instructions which correspond directly to CPU instructions.

You claim that Python programming of microcontrollers is mainstream, but this is neither true nor possible. Python SCRIPTING of code modules (which cannot themselves be written in Python) is certainly one way to assemble a system from pre-built Legos.

If you refer to knowing what I'm talking about as gatekeeping and egoism, might I suggest that you insist less forcefully on the correctness of the incorrect things you state? We could be done with this spat in short order if YOU would refrain from speaking falsehoods. Lies. Untrue things.

I look forward to your Lisp C compiler. Make sure that it's 100% Lisp from the bottom up, or I'll consider you to have ceded my point. Consider that the Lisp you'd author it in has a garbage collection system that Lisp could not have written originally, and has no semantics for the underlying memory structures, but hey, I guess if one is committed to pretending that all languages are equal for all tasks, who am I to question one's self-identification with a given language.


> It is NOT good as a high-volume web server, for example

You couldn't have picked a worse example - your CS major seems to need a refresher. NodeJS came to be precisely because V8, single-threaded and using an event loop, was GREAT at high-volume web serving. It massively reduced the overhead vs. multi-process or thread-based web servers and absolutely dominated performance and concurrency benchmarks. We started playing with 1M concurrent connections while you might barely get 100 on Apache a few years earlier. There were other async servers at the time (Tornado, Puma, Netty...), but the async-by-default ecosystem in node was a unique advantage.

Fast forward to today, and it's not an accident that the majority of high-performance web servers are now asynchronous and/or use cooperative multitasking (or even libuv directly, a spin-off of Node.js development): Vert.x, Actix, h2o, Jetty, Go with goroutines, etc. It's a much more efficient model.
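
The whole request-handling model fits in a few lines; a minimal sketch, not production code:

    const http = require('http');

    // one process, one event loop: each request is just another event,
    // no thread spawned per connection
    http.createServer((req, res) => {
      res.end('ok\n');
    }).listen(8080);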


1) You confuse me with a CS major on another thread. Pick the right feud, please.

2) "you couldn't have picked a worse example" HAAAA! You are wrong. wrong wrong. A) Do you even know what a memory leak IS and why/how it occurs? B) Did you understand what I said regarding stack allocation and out-of-scope equals "memory gone" and how that fundamentally differs from heap-based allocation as ALL JS OBJECTS ARE!??? you are wasting my time, friend.

You are straight up copying marketing lines from Node.js with no apparent understanding of what a single-threaded event loop even is! (It is a GUI mechanism. Almost all Windows-style apps since 199-whatever have had an event loop as ONE of their threads. Open Xcode or Visual Studio, make a generic desktop app, add a button, and make its click handler call a method of something or other. Check a moderately complex audio/visual production application for clues on just how many threads one uses, and for what purposes they are separate threads, etc.)

You then mention VERTX AND GOROUTINES RIGHT AFTER EXTOLLING THE VIRTUES OF A SINGLE THREADED EVENT LOOP. GO LOOK UP WHAT THOSE 2 SPECIFIC TECHNOLOGIES USE AND DO, VERSUS A SINGLE THREADED EVENT LOOP.

Do note why single-core limits exist in single-hardware-thread execution worlds, and stop saying that interleaving tasks on a single execution core is superior to architecting synchronized activity across all cores. And do note that you are primarily referring to CRUD web-dev while positing very, erm, controversial positions on programming language use-cases.

Node.js is a terrible high-volume web server, as all of the alleged virtues you extol are workarounds for the single-threaded execution model of what was never designed to be a server language. I will say it again: HTTP is a stateless protocol involving a request (method call) and a response (what it returns); it then goes out of scope and disappears. No memory leaks, no heap memory allocation, no malloc, no new, etc. It's just poof, gone. Got it? This was done on purpose by smart computer people. If you want to run each request as a separate OS process, don't blame HTTP; the primitives were designed for efficiency.

Please DO look up what heap vs. stack allocation is. Please DO look up what a register VM vs. a stack VM is. Please DO NOT confuse scripting or glue code with machine code, as the machine code of your Javascript program is precisely the Javascript runtime, with its execution paths being puppeted by your script language. You are literally pushing someone else's buttons and calling it computer programming. No offense; that's why it's a high-level language, not a low-level one.

When you resort to reductio ad absurdum, suggesting assembler as the tool for all programming, you are not engaging with anything I have said or written here. Tools have purposes; not everything is a hammer/nail.

In a way, you are not wrong, as a macro-assembler, or C or C++ or Rust or Zig or Nim, etc. (non-garbage-collected compiled languages capable of outputting machine code as the final target), is the next level up from asm.

Why do you suppose that Chrome is not written in JS but rather largely in C++?


Wait, where do memory leaks come in, and what the hell do they have to do with the subject at hand? I was talking about its fit for web servers, not embedded development. I'm disputing very specific comments you have made. You completely ignored what I said, and now accuse me of "copying marketing lines" and not knowing what an event loop is? Really? Is this how you win arguments?

I don't know why you're yelling in caps - Vert.x uses an event loop. Goroutines are cooperative scheduling. Yes, they can also coordinate over multiple threads (guess what, node can too), but the underlying architecture for processing requests is the same.

> Node JS is a terrible high-volume web server

Again, why would that be? Node was invented precisely to be a high-performance server. That's literally its purpose, and it delivers. Check out any benchmarks like TechEmpower and guess which platforms you'll find near the top. I'm not saying it's the best choice, just stating facts. Nothing else to say here - you obviously have strong opinions but zero hands-on knowledge in this area.


> That directly affected Node.js as an implementation.

It didn’t “affect” node, it was the whole reason it came to exist: its cooperative multitasking was a good way to tackle concurrency (remember c10k), and V8 was available and easily embeddable. Without the event loop there would have been no reason to choose JS.


Though there are C compilers and also C code generators written in Lisp.


Loved this video (4 mins) of a talk by Gary Bernhardt (CodeMash 2012) on the topic of JS's quirks:

https://www.destroyallsoftware.com/talks/wat


On Twitter he was also talking a lot about his journey into TypeScript, which I very much enjoyed. Great guy.


Yep, the "wat" talk helped, got 17/25 first try since I recalled some of the weirdness.


> the "wat" talk

so that's what it's called.


The vast majority of these are implicit type coercion issues, which are fairly easily banned with a linter. Once you've done that, I think JavaScript is probably the nicest of the big dynamic scripting languages (Perl, PHP, Python, Ruby) to work with.
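
For instance, something along these lines (a sketch; eqeqeq and no-implicit-coercion are real ESLint core rules):

    // .eslintrc.js
    module.exports = {
      rules: {
        eqeqeq: 'error',                  // require === / !==, no coercing ==
        'no-implicit-coercion': 'error'   // ban !!x, +x, "" + x shorthand coercion
      }
    };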

There could definitely be improvements, but trash seems to be taking things too far.


The article only really touches on weird language caveats, but where JS really gets weird is its ecosystem. If you are working with JS in a commercial capacity you will eventually find dependency hell, arbitrary toolchain complexity, asynchronous and callback mind-bending, transpiling and compilation, and all just to render some HTML. Quick, get your compsci degree, Bobby, we need to make this button go to another page!


> I think JavaScript is probably the nicest of the big dynamic scripting languages (Perl, PHP, Python, Ruby) to work with.

As an experienced Python programmer learning JavaScript, this isn’t true for me yet but I hope it becomes true. I think JS will be much more useful to me than Python.

> Which is fairly easily banned with a linter.

Any particular recommendations?


ESLint is the default linter. Standard is a curated, good-quality list of rules.

https://github.com/standard/eslint-config-standard

I would start with that and tweak what you don't like
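
Something like this, as a sketch (the tweak shown is just an arbitrary example):

    // .eslintrc.js
    module.exports = {
      extends: ['standard'],
      rules: {
        // e.g. drop the space-before-function-paren requirement if it bugs you
        'space-before-function-paren': 'off'
      }
    };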


> Any particular recommendations?

TypeScript.


Trash is something you are discarding. If you are keeping it because it still has some utility to you, then it is by definition not trash.


So I think we should interpret "trash" here as "badly designed".


That's only one of multiple equally valid definitions.


Because some developers will take offense. They will think that if you call the language they use trash then you are calling them a trash developer.

It's odd; there are developers who make the language they work in part of their identity and see attacks on that language as attacks on themselves.

I don't get it, but it is what it is. If you call javascript trash, you will offend people.


We are not avoiding saying it because we are afraid to; we don't say it because it is not true.

JS is a great programming language with a lot of quirks and footguns, most of them easily avoidable by enforcing good coding styles, using easily available and widespread tooling.

And most languages have linters or compilers that issue coding-style-related warnings, including C, so this is not specific to Javascript.


Why is JS a great programming language?


There are a lot of things to hate about JS: its questionable choices around type coercion, its Date object, for..in on arrays (indices are strings, a consequence of every object key being a string), for..in on objects, which also iterates over inherited properties, the fact that everything is dynamic, its weak standard library, null vs undefined, many other original things…

There are also great things about it: the ease with which one can write functional-style code, anonymous functions (vs Python), closures (vs Java), immutable strings, the fact that there is not a shitload of classes to instantiate to achieve anything (vs Java), block-scoped variable definitions, sane default function parameter handling (vs Python).
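
A throwaway illustration of the kind of thing I mean by closures and a functional style coming naturally:

    const makeCounter = () => {
      let n = 0;
      return () => ++n;             // closure over n
    };

    const next = makeCounter();
    [1, 2, 3].map(x => x * next()); // => [1, 4, 9]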

Javascript engines are also very fast nowadays because of the ton of engineering going into them.

I'd say, above all other arguments, you can rapidly build very fast, lightweight programs with it.

Sure, you can pull in a shitload of pointless dependencies and write bloatware that re-renders the world on each keypress, and that's what many people do. And to make things worse, they wrap their shit in an entire browser that uses a lot of RAM and processing power. But you are not forced to do that. You can also write efficient server code that does not show up in top, has few dependencies if any, consumes a negligible amount of memory, and runs without failure for months. That's possible too. Frontend-wise, you have amazing frameworks like Svelte that do wonders and let you build very lightweight apps rapidly.

That said, I would often pick TypeScript so the program is statically type-checked and the code is documented through types, especially for more-than-one-person / non-trivial projects.

I also use and like other languages like Python and D (and a bit of Rust) that have good stuff which JS doesn't have (list comprehensions, borrowing, traits, named parameters, sane coercion to False…), or lack bad stuff which JS has. Javascript is not always the best answer, or even a good answer, but it can be one.


If we want to perceive JS as it is, we need to follow its history, understand the basic ideas behind different features of the language, understand the trade-offs behind various design decisions, etc.

Basically, it was meant to be a dialect of Scheme, but with map and array data structures at the bottom, as opposed to lists (hello, Clojure!). Suddenly, the last-minute decision was made to add Java-like syntax and OOP to this language and rebrand it as 'JavaScript'. Brendan Eich did the best he could in the limited timeframe to implement this. Then in 2015 a 'design-by-committee' approach was adopted and horrible feature creep started. We can then investigate every feature and how it pushed an individual company's agenda while compromising the integrity and vision of the language, but that's a typical design-by-committee story.


I wouldn't say it's trash. JavaScript has many ugly quirks, but it has many good facets too, and it can be used to write effective and fairly elegant code. Considering how quickly it was thrown together, and that it's been evolving ever since, I'd say it's a surprisingly good language.

> how can we make progress if things are not perceived as they are?

People aren't blind to JavaScript's failings. The evolution of JavaScript has been a mix of adding new features and fixing what they got wrong before, e.g. the way it has two different isNaN functions. [0]

[0] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
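
The difference in a nutshell (the newer Number.isNaN doesn't coerce, unlike the old global):

    isNaN("foo");        // true  -- the old global coerces "foo" to NaN first
    Number.isNaN("foo"); // false -- only true for the actual NaN value
    Number.isNaN(NaN);   // true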


Is javascript any more "trash" than python or ruby? I don't think so.


Presuming server-side JS, as you typically won't run Ruby or Python in the browser, you can say that all three dynamic languages, as such (garbage-collected, heap-based, untyped), will waste computer resources and perform worse than typed compiled languages (even garbage-collected ones) due to memory usage patterns.

Python has an interesting C FFI, among other approaches, that can at least allow CPU- and memory-bound tasks to be accomplished inside a native code module. You see a lot of domain-specific work in Python due to these approaches, Cython, etc.

Crystal is what I would urge all Ruby devs to look at (an LLVM-based compiled language that has HTTP in its standard library, for one thing... I'd pit it against Go, for example).

As I see large front-end teams frequently producing JS from a TypeScript base lately, I'm not sure that even the JS community supports "everything JS" anymore. At the point one does backend Node.js work with TypeScript or other more rigidly typed systems, I have to wonder if targeting V8 is really the desired option any more, and if the programmer would not be better off switching to a high-performance, feature-complete, typed backend language (the single-threaded JS execution model being a needless constraint at this point, for example; it's great as an event loop).


> As I see large front-end teams frequently pushing JS from a typescript base lately, I'm not sure that even the JS community supports "everything JS" anymore.

Given how close TypeScript is to JavaScript, wouldn't it be more appropriate to treat it as just a dialect of JavaScript (like CoffeeScript was) rather than a different language like Python or Go would be?


In the sense that there's extra meta-information (types), it can be used as a frontend for things other than JS, so I could easily see:

1) TS to WebAssembly (once JS is out of the picture, might be a while, lol...)

2) TS as an LLVM IR frontend (compiled TS on the server)

which is basically predicated upon industry familiarity as its selling point, so yeah, I could see where it's also just a JS industry side-effect. A wart remover, lol.


From what I read on forums, every single piece of technology that's ever been invented is trash according to some self-appointed expert. I doubt I could name a programming language used by more than 100 people about which at least one person hasn't written a critical comment.


Elixir?

I've never used it, but reading about it I kept going ooh, aah and wow! Instead of my standard: ** ** *k * ** *sh?!?


https://news.ycombinator.com/item?id=25787548

I've never used Elixir so can't say whether the criticisms are justified.


the only trash is that comment LooL


I thought the same thing. I think the author is afraid of offending people. Downvote culture has conditioned a lot of people to walk on eggshells and write this way.


Funny how this comment got downvoted into Valhalla in a matter of minutes.


> JavaScript is a great programming language

I have no idea how you could come up with this sentence after this brilliant display of what an insanely stupid language JavaScript is.


To be fair, just because a language allows you to do stupid things, doesn't mean that it's bad.

C, C++, Perl (oh, Perl!) allow you to do weird things, too, so do lots and lots of other languages.

A certain knowledge of this weirdness is very helpful, though, in avoiding it.


In fact, Perl 5 has less-weird coercion rules than JavaScript.



