Should You Use JavaScript Maps And Sets?
JavaScript sets and maps have been around for a few years now, but I still get a lot of questions about them. My students wonder if they should be replacing traditional objects and arrays with these new data types. While there are some killer use cases for sets and maps, you should really look at them as specialized tools rather than Swiss army knives.
A Set is a collection, like an array, except each value must be unique. They’re like what would happen if objects and arrays had a baby. Here’s a crash course:
const mySet = new Set();
mySet.add(1); // add item
mySet.add('a');
mySet.size; // 2
mySet.delete(1); // remove item
mySet.has(1); // false
mySet.clear(); // empties set
Removing duplicate values from an array is probably the only time I've actually seen sets used in the wild. It's a handy one-liner:
const arr = [1, 2, 3, 4, 4, 5, 6, 7, 7, 7];
const unique = [...new Set(arr)];
// unique equals [1, 2, 3, 4, 5, 6, 7]
If you're trying to solve literal set problems, sets are obviously the go-to. You can see on the set docs how to implement the basic set operations. This stuff will probably come up when doing algorithm challenges, so it's worth taking a look.
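As a rough sketch of what those operations look like (this is more or less what the MDN docs walk through, written with plain spread and filter):

const a = new Set([1, 2, 3, 4]);
const b = new Set([3, 4, 5, 6]);

// Union: everything from both sets
const union = new Set([...a, ...b]); // {1, 2, 3, 4, 5, 6}

// Intersection: only values that appear in both
const intersection = new Set([...a].filter(x => b.has(x))); // {3, 4}

// Difference: values in a that aren't in b
const difference = new Set([...a].filter(x => !b.has(x))); // {1, 2}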
Maps are honestly something that I personally thought would take over the landscape. But when you get down to it, they're not as much of an upgrade as you might think. They're another way to store key/value data, but they're more purpose-driven than objects, so they have some added benefits. Here's a crash course:
const myMap = new Map();
myMap.set('a', 1);
myMap.set('b', 2);
myMap.set('c', 3);
myMap.get('a'); // 1
myMap.set('a', 'ok');
myMap.get('a'); // 'ok'
myMap.size; // 3
myMap.delete('b'); // removes b key/value
myMap.clear(); // empties map
The most obvious benefit of maps is that they can take pretty much anything as a key. Objects, on the other hand, convert every key into a string. For instance, if you tried to use unique objects as object keys, they would all get stringified into '[object Object]' and overwrite each other. Luckily, with maps, that's not a problem! Each object functions perfectly well as a unique key. And if you used the same object key with a new value, it would overwrite the original value, just like you'd expect. It's not a super common scenario, but it's a good trick to know.
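Here's a quick sketch of the difference (the objects here are just placeholders):

const alice = { name: 'Alice' };
const bob = { name: 'Bob' };

// As object keys, both get stringified to '[object Object]'
const byObjKey = {};
byObjKey[alice] = 'first';
byObjKey[bob] = 'second';
Object.keys(byObjKey); // ['[object Object]'], only one key survives

// As map keys, each object stays distinct
const byMapKey = new Map();
byMapKey.set(alice, 'first');
byMapKey.set(bob, 'second');
byMapKey.get(alice); // 'first'
byMapKey.size; // 2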
Technically, objects do kind of maintain insertion order in modern JS. HOWEVER, there are caveats to that key order (integer-like keys, for one, get sorted numerically ahead of everything else). If you really need to be sure that your key/value pairs maintain their insertion order for iteration, use a map.
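For example, here's where object key order gets surprising:

const scores = { b: 1, '2': 2, a: 3, '1': 4 };
Object.keys(scores); // ['1', '2', 'b', 'a'], integer-like keys jump to the front

const scoreMap = new Map([['b', 1], ['2', 2], ['a', 3], ['1', 4]]);
[...scoreMap.keys()]; // ['b', '2', 'a', '1'], pure insertion order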
It varies by browser, but for Chrome browsers, Maps could hold 16 million entries, while objects could only hold 11 million. So … this is technically a thing, I just doubt you’ll ever hit it (if you do, please tell me what data you were wrangling).
If you thought those seem like pretty specialized situations, you'd be right. I bet there are edge cases I missed (please post below if you know of any), but those situations above are the good stuff. Which means you're safe to keep using regular objects the vast majority of the time. If you really want to use maps and sets for readability or something, go for it. Just don't feel like you're missing out if you don't.
JSON can't encode maps and sets properly yet, so that's something you may want to consider if you're dealing with APIs. You'll need to convert each map back into a plain old JS object (and each set into an array) first. I'm sure this will change in the future.
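Here's roughly what that conversion looks like (assuming string keys, since anything fancier won't survive the trip anyway, and assuming Object.fromEntries from ES2019 is available):

const settings = new Map([['theme', 'dark'], ['fontSize', 14]]);

JSON.stringify(settings); // '{}', not what you want

// Convert to a plain object before serializing (string keys only)
const json = JSON.stringify(Object.fromEntries(settings)); // '{"theme":"dark","fontSize":14}'

// ...and rebuild the map on the way back in
const revived = new Map(Object.entries(JSON.parse(json)));

// Sets are easiest to ship as arrays
JSON.stringify([...new Set([1, 2, 3])]); // '[1,2,3]'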
Also, you might hear some people say that maps are worth it because they're iterable by default. However, since we're already using ES2015+ syntax to get maps, we also have the Object.keys, Object.values, and Object.entries iterating functions, which sort of steals that thunder a bit (see the sketch below). And to my knowledge, maps don't have any iteration speed bonus. Which leads me to my last point.
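Here's that sketch; the two loops end up looking nearly identical (the variable names are just mine for illustration):

const plainObj = { a: 1, b: 2 };
const pairMap = new Map([['a', 1], ['b', 2]]);

// Maps are iterable directly...
for (const [key, value] of pairMap) {
  console.log(key, value); // 'a' 1, then 'b' 2
}

// ...but Object.entries gives a plain object the same [key, value] shape
for (const [key, value] of Object.entries(plainObj)) {
  console.log(key, value); // 'a' 1, then 'b' 2
}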
So, are they actually faster? Maybe. On MDN's maps page it says that they "perform better in scenarios involving frequent additions and removals of key-value pairs." However, I personally haven't seen that to be the case in my tests or research. It is true that Map.delete is faster than the object delete keyword, but there's a catch: Map.set is slower than plain property assignment on an object, so whatever bonus you get from faster deletes gets a huge chunk taken out by slower inserts. Also, certain browsers implement things differently, which means it's not a consistent boost. In my, albeit limited, testing, I found objects were always faster, but not by much.
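If you want to poke at this yourself, a rough (and very unscientific) check looks something like this; the exact numbers swing a lot between engines and runs:

const COUNT = 1000000;

console.time('object assign/delete');
const objStore = {};
for (let i = 0; i < COUNT; i++) objStore[i] = i;    // plain property assignment
for (let i = 0; i < COUNT; i++) delete objStore[i]; // the delete keyword
console.timeEnd('object assign/delete');

console.time('map set/delete');
const mapStore = new Map();
for (let i = 0; i < COUNT; i++) mapStore.set(i, i);
for (let i = 0; i < COUNT; i++) mapStore.delete(i);
console.timeEnd('map set/delete');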
As for sets, there can be no debate that set.has is faster than array.includes (that's O(n) for arrays vs. O(1) for sets). Unfortunately, set.add seems much slower than arr.push. So, if a list were big enough that searching it was costly, building the set in the first place would be slow enough to eat up most of the speed boost you'd get from the faster lookups. I think if you were searching hundreds or thousands of times on a list with a ton of items, then sets might be worth it.
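Again, a rough sketch if you want to see it for yourself (same caveats: numbers vary by engine, and this is exactly the kind of isolated test I warn about below):

const items = Array.from({ length: 100000 }, (_, i) => i);

console.time('build set');
const lookup = new Set(items); // the up-front cost you pay for fast lookups
console.timeEnd('build set');

console.time('array.includes x1000');
for (let i = 0; i < 1000; i++) items.includes(99999); // scans most of the array each time
console.timeEnd('array.includes x1000');

console.time('set.has x1000');
for (let i = 0; i < 1000; i++) lookup.has(99999); // constant-time lookup
console.timeEnd('set.has x1000');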
I’d always recommend actually performance testing your application before swapping anything. Just making up isolated JSPerf tests can't tell you nearly as much as actually timing your application. I don’t think there are any blanket cases where maps or sets have an edge with performance. I’m 100% sure there are edge cases, but those would need to be discovered after unique investigations. So far, it just seems like JS hasn’t prioritized performance with these new data types.
With the exception of those specialized use cases, there isn’t much of a reason to use the new data types. They’re new tools to add to your toolbox, not replacements.
happy coding everyone,
mike