merge chained filters
tl;dr: Refactor code like `[].filter((v) => v >= 0.4).filter((v) => v <= 0.6)` to `[].filter((v) => v >= 0.4 && v <= 0.6)`. This way, the code iterates over the array just once, rather than n times (once per `filter` call).
```js
const tmp = [];
// Generates random values.
for (let i = 0; i < 1e5; ++i) {
  tmp.push(Math.random());
}
console.log(tmp);

// Benchmarks chained filters.
const t0 = performance.now();
tmp.filter((num) => num >= 0.4).filter((num) => num <= 0.6);
const t1 = performance.now();
console.log("chained filters", t1 - t0, "ms");

// Benchmarks the single, merged filter.
const t2 = performance.now();
tmp.filter((num) => num >= 0.4 && num <= 0.6);
const t3 = performance.now();
console.log("unchained filter", t3 - t2, "ms");
```
```text
[0.6026284906105901, 0.805413781674541, 0.35849118962592796, 0.09264987817392267, ...]
chained filters 8.400000005960464 ms
unchained filter 3.8000000044703484 ms
```
As you can see, the unchained filter is faster than the chained filters by about (8.4 - 3.8) / 8.4 * 100 ≈ 55%.
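If you prefer to keep each condition as a separate, named predicate for readability, you can still get the single-pass behavior by combining them before filtering. A minimal sketch (the `allOf` helper name is my own, not a built-in):

```js
// Combines several predicates into one, so a single filter pass suffices.
const allOf = (...preds) => (v) => preds.every((p) => p(v));

const atLeast04 = (v) => v >= 0.4;
const atMost06 = (v) => v <= 0.6;

// One iteration over the array, but the predicates stay reusable.
const inRange = [0.1, 0.5, 0.45, 0.9].filter(allOf(atLeast04, atMost06));
console.log(inRange); // [0.5, 0.45]
```

Note that `every` short-circuits, so later predicates are skipped as soon as one fails, just like the `&&` version.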