tracker1 2864 days ago | 1 point
I would say that if your data set is hundreds of thousands of records, JS and Node can be *very* bad at this... In general, when trying to map-reduce larger data sets in Node, you really need to be careful about how you introduce, stream and dispatch results.
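
For example, a minimal sketch of that kind of streaming map-reduce (the file name, the fields, and the newline-delimited JSON format are just placeholder assumptions here) might look like:

```js
// Stream a large newline-delimited JSON file and fold it into a small
// reduce aggregate, instead of loading everything into memory at once.
const fs = require('fs');
const readline = require('readline');

async function mapReduce(path) {
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity
  });

  const totals = Object.create(null); // the reduce aggregate

  for await (const line of rl) {
    if (!line) continue;
    const record = JSON.parse(line);                    // introduce
    const key = record.category;                        // map
    totals[key] = (totals[key] || 0) + record.amount;   // reduce
  }

  return totals; // dispatch results once, at the end
}

mapReduce('data.ndjson').then(console.log).catch(console.error);
```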

Node has a command-line flag (--expose-gc) that exposes manual garbage collection (good to trigger after each record, or every N records, as memory use will grow fast otherwise). Even then, when you have that many objects referenced, collection will collapse under the load... You can use streams for the map step and reduce into aggregates, but if your reduction aggregate object/map gets too big, it will blow up the runtime when you exhaust the available memory.
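
Something like this, run with `node --expose-gc` (the record source below is just a stand-in for a real stream or cursor):

```js
// Sketch: call global.gc() every N records to keep heap growth in check.
// gc() is only exposed when the process is started with `node --expose-gc`.
const GC_EVERY = 100000;

function* recordSource() {
  // Stand-in for a real stream/cursor of records.
  for (let i = 0; i < 1000000; i++) {
    yield { id: i, value: Math.random() };
  }
}

let sum = 0;
let count = 0;
for (const record of recordSource()) {
  sum += record.value; // the "reduce" step
  if (++count % GC_EVERY === 0 && typeof global.gc === 'function') {
    global.gc(); // guarded, so the script still runs without the flag
  }
}
console.log({ count, sum });
```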

Realistically, Go and even Rust are better suited to these kinds of flows with larger data. I love JS, it's my favorite language, and I really like Node... but there are some use cases it's absolutely bad for.