The Tilde In JavaScript
Here’s a little tidbit I learned recently: the tilde “~” is an operator in JS. Who knew? It turns out a lot of people, actually, but since I didn’t find out until way late in the game I thought I would help spread the word.
What does it do?
In JS the tilde performs a bitwise NOT. For any given bit, 1 or 0, the tilde flips it to the opposite bit, 0 or 1. JS doesn’t have a binary type though*, so this definition isn’t all that useful on its own. When dealing with good ol’ decimal numbers, the effect of a bitwise NOT can be expressed with the simple formula -1 × (N + 1). If you’re curious why exactly this holds true, MDN has a great explanation. In a nutshell, it’s because the bitwise operators convert their operands to 32-bit signed integers, which JS encodes using two’s complement. But regardless of how it works, let’s see the effect of the tilde in the wild.

* interesting nugget: JavaScript does support hex and octal literals, though these are just alternate notations for writing numbers, not distinct types
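A quick sketch of the formula in action (the inputs are arbitrary examples):

```javascript
// Bitwise NOT flips every bit of the 32-bit representation,
// which for any integer N works out to -(N + 1).
console.log(~0);   // -1
console.log(~5);   // -6
console.log(~-1);  // 0
```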
So why should I care?
At first glance this might seem pretty useless. Well, to a certain extent it is… most JS developers get along just fine without the tilde. Joe Zim has documented a pretty nifty application for it though.
JS functions that return array or string indices often use -1 to indicate failure, since 0 is a valid index. This poses a slight problem because -1 is truthy, meaning that it will evaluate to true when converted to a boolean. You’re probably familiar with the following gotcha when using String.indexOf:
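A minimal illustration of the gotcha (the string and search term are made up for the example):

```javascript
const str = "hello";

// Gotcha: indexOf returns -1 when the substring is missing,
// but -1 is truthy, so this condition is always true.
if (str.indexOf("z")) {
  console.log("runs even though 'z' is not in the string");
}

// The conventional fix is an explicit comparison:
if (str.indexOf("z") !== -1) {
  console.log("only runs when 'z' is actually found");
}
```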
When I first learned about this I simply accepted all of those “!== -1” checks as a necessary evil. Okay, they’re really not all that bad. Still, the tilde can help out here a bit. If you remember our formula from earlier, ~-1 is 0 (which is falsy). Conversely, ~ applied to any other index is non-zero (which is truthy). Armed with this knowledge we can refactor those “!== -1” checks like so:
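A sketch of the refactor, using the same made-up string as before:

```javascript
const str = "hello";

// ~(-1) is 0 (falsy); ~ of any valid index is non-zero (truthy).
// So the tilde turns indexOf into a found/not-found test.
if (~str.indexOf("e")) {
  console.log("'e' was found");
}

if (!~str.indexOf("z")) {
  console.log("'z' was not found");
}
```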
Cool, huh? It may not be a game-changer, but it is a nice little trick to clean up your code.