Javascript's weird typecasting rules

Adrien Foucart, PhD in biomedical engineering.


Over on Instagram, I saw a post making fun of Javascript’s weird and unpredictable (unless you really know Javascript) typecasting behaviour. There is a very long tradition of mocking Javascript, with perhaps the most famous instance being the “Wat” video from Gary Bernhardt, complete with very dated early 2010s meme culture (I feel old).

Neither really takes the time to explain what’s going on, so I thought it would be interesting to quickly go through the Instagram one, because it nicely illustrates the difficulty of creating a programming language that intuitively “makes sense.” Something which Javascript spectacularly fails to do.

So, first one:

console.log(3 + 3);
6

This, obviously, is the expected behaviour. Nothing to see.

console.log("3" + "3");
33

This is logical as well, but it clearly shows the root of the issue with a dynamically typed language. Because the + operator means “arithmetic addition” (for numbers) but also “concatenation” (for strings), the creators of the language had to make some choices as to what happens when you mix both types. Which is where it starts to become weird.
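
To see the choice Javascript makes, we can mix the two types directly (these two lines are my own quick check, not part of the original post):

console.log("3" + 3);  // 33: the number is cast to a string, then concatenated
console.log(3 + "3");  // 33: same result, the coercion goes towards the string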

console.log("3" + + 3);
33

In this case, the + + is not actually needed: we could just use a single + and get the same result. It’s just there to set up the next bit. But the reasoning here is simple: we start from a string, and we have a +, so we want to concatenate something. What is that something? + 3, which is a number. But we cannot concatenate a number to a string, so it is first cast into the string 3, hence the final result: the string 33.
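
Written out with the implicit conversion made explicit, the evaluation looks roughly like this (the String() call is just my own way of spelling out the cast):

console.log("3" + (+ 3));      // 33: + 3 is just the number 3
console.log("3" + String(3));  // 33: which gets cast to the string 3 before concatenation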

console.log(3 + + "3");
6

This is where it starts being a bit confusing, especially since doing 3 + "3" actually outputs 33. So the casting is done towards the string type for "3" + 3 and for 3 + "3", but the second + sign causes everything to fall back to numbers. Why? Because + in Javascript isn’t just for addition or concatenation: it can also be the unary plus operator, which converts whatever follows it to a number. And, because the unary plus has higher precedence than the addition operator, what Javascript evaluates is:

  1. a = + "3" = 3
  2. 3 + a = 3 + 3 = 6
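
We can check this by adding the parentheses and the conversion ourselves (my own spelling-out of the two steps):

console.log(typeof (+ "3"));   // number: the unary plus turns the string into a number
console.log(3 + (+ "3"));      // 6: so this is really just 3 + 3
console.log(3 + Number("3"));  // 6: the same thing with an explicit conversion
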
console.log(3 + 3 - 3);
3

No problem here.

console.log("3" + "3" - "3");
30

What’s happening here? The two operators, + (addition, because there are operands on both sides, not unary plus) and -, have the same precedence, and arithmetic operators are “left-associative” (except exponentiation, but let’s not go there), so this gets evaluated as ("3" + "3") - "3".
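
Adding the parentheses ourselves confirms the grouping; the second example below uses different digits (my own, not from the original post) so that the two groupings give visibly different results:

console.log(("3" + "3") - "3");  // 30: the grouping Javascript actually uses
console.log("1" + "2" - "3");    // 9: evaluated as ("1" + "2") - "3", i.e. "12" - "3"
console.log("1" + ("2" - "3"));  // 1-1: forcing the other grouping concatenates "1" and -1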

With "3" + "3", + will mean “concatenation,” so we get the string 33. Then, we subtract the string 3. But while addition applied to strings is concatenation, subtraction applied to string… doesn’t exist in javascript. The subtraction operator always try to “coerce both operands to numeric values” before performing the operation, so "33" - "3" is evaluated as 33 - 3 = 30.

How do other languages handle it? Typically, most languages prefer to throw an exception rather than produce unintuitive outputs. Python, for instance, will refuse to do "3" + 3 or 3 + "3", throwing a TypeError in both cases. It does have its own weirdness, however, as "3" * 3 will work and output the string '333': the multiplication operator, applied to a string, is interpreted as a “repeat” operator in Python.
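
For what it’s worth, Javascript makes the opposite choice for *: there is no “repeat” meaning for multiplication, so the strings get coerced to numbers, just like for subtraction:

console.log("3" * 3);    // 9
console.log("3" * "3");  // 9: both operands are coerced to numbers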

So what’s the conclusion of all that? Except for “don’t use Javascript unless you really, really have to,” it’s this: programming languages are often weird, but in the end they are just following their internal rules to the letter. I certainly prefer languages where those rules tend to produce the behaviour that I intuitively expect. But languages are designed according to their own needs. Javascript tries to keep going as long as possible without crashing, because the goal is for end users in their web browsers to never see an error. “Forgiveness by Default,” because if it crashes, users are unhappy.

Also, developers have been making fun of Javascript for so long that it would now feel weird if it became less weird.