
I'm using a web worker to calculate coordinates and the values belonging to those coordinates. The calculations happen in the background perfectly, keeping the DOM responsive. However, when I send the data from the web worker back to the main thread, the DOM becomes unresponsive for part of the transfer time.

My web worker (sending part):

//calculations happen before this point; this is the final step: hand the calculated data back to the main thread.
var processProgressGEO = {'cmd': 'geoReport', 'name': 'starting transfer to main', 'current': c, 'total': polys};
postMessage(processProgressGEO);
postMessage({
  'cmd':'heatmapCompleted',
  'heatdata': rehashedMap,
  'heatdatacount': p,
  'current': c,
  'total': polys,
  'heatmapPeak': peakHM,
});
self.close();

The variable rehashedMap in the snippet above is an object with numerical keys; each key holds an array of objects.
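For clarity, a simplified sketch of that shape (the lat/long/value property names are taken from the discussion in the comments below; the real data is far larger):

```javascript
// Simplified sketch of the shape of rehashedMap: numerical keys, each holding
// an array of plain objects. Property names are assumed from the comments.
var rehashedMap = {
  0: [{lat: 51.123, long: 23.123, value: 0.7986}],
  1: [
    {lat: 51.124, long: 23.120, value: 0.1234},
    {lat: 51.125, long: 23.121, value: 0.5678}
  ]
};
```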

My main thread (only the relevant part):

var heatMaxforAuto = 1000000;  //maximum number of datapoints allowed in the texdata. This takes into account the spread of a single datapoint.
async function fetchHeatData(){
  return new Promise((resolve, reject) => {
    var numbercruncher = new Worker('calculator.js');
    console.log("Performing Second XHR request:");
    var url2 = 'backend.php?datarequest=geodata'
    $.ajax({
      type: "GET",
      url: url2,
    }).then(async function(RAWGEOdata) {
      data.georaw = RAWGEOdata;
      numbercruncher.onmessage = async function(e){
        var w = (e.data.current/e.data.total)*100+'%';
        if (e.data.cmd === 'geoReport'){
          console.log("HEAT: ", e.data.name, end(),'Sec.' );
        }else if (e.data.cmd === 'heatmapCompleted') {
          console.log("received Full heatmap data: "+end());
          data.heatmap = e.data.heatdata;
          console.log("heatData transfered", end());
          data.heatmapMaxValue = e.data.heatmapPeak;
          data.pointsInHeatmap = e.data.heatdatacount;
          console.log("killing worker");
          numbercruncher.terminate();
          resolve(1);
        }else{
          throw "Unexpected command received by worker: "+ e.data.cmd;
        }
      }
      console.log('send to worker')
      numbercruncher.postMessage({'mode':'geo', 'data':data});
    }).catch(function(error) {
      reject(0);
      throw error;
    })
  });
}

async function makemap(){
  let heatDone = false;
  if (data.texdatapoints <= heatMaxforAuto){
    heatDone = await fetchHeatData();
  }else{
    var manualHeatMapFetcher = document.createElement("BUTTON");
    var manualHeatMapFetcherText = document.createTextNode('Fetch records');
    manualHeatMapFetcher.appendChild(manualHeatMapFetcherText);
    manualHeatMapFetcher.id = 'manualHeatTriggerButton';
    manualHeatMapFetcher.addEventListener("click", async function(){
      $(this).toggleClass('hidden');
      heatDone = await fetchHeatData();
      console.log(heatDone, 'allIsDone', end());
    });
    document.getElementById("toggleIDheatmap").appendChild(manualHeatMapFetcher);
  }
}

makemap();

The call to the end() function is needed to calculate the seconds elapsed since the web worker started. It returns the difference between a globally set start time and the time of the call.
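For reference, a minimal sketch of what such a helper might look like (the actual implementation isn't shown in the question; this is an assumption based on the description above):

```javascript
// Hypothetical timing helper: seconds elapsed since a globally set start time.
var starttime = Date.now();
function end() {
  return Math.round((Date.now() - starttime) / 1000);
}
```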

What shows in my console:

HEAT:  starting transfer to main 35 Sec.   animator.js:44:19
received Full heatmap data: 51             animator.js:47:19
heatData transfered 51                     animator.js:49:19
killing worker                             animator.js:52:19

1 allIsDone 51

The issue: My DOM freezes between the start of the data transfer and the message after receiving the full heatmap data, i.e. between the first and second messages in my console. The transfer takes 16 seconds, but the DOM only goes unresponsive once, for part of that time. Web workers can't share data with the main thread, so a transfer is needed.

Question: First, how can I prevent the freeze of the DOM during the onmessage phase of the web worker? Second, more out of curiosity: how can this freeze occur during only part of that phase, given that these messages are triggered by two consecutive steps with nothing going on in between?

What I tried so far:

  1. Doing a for-loop over the rehashedMap and posting it back key by key. This still triggers DOM freezes; shorter ones, but more than one. In rare occurrences it takes the tab down.
  2. Looking for a way to buffer the onmessage phase; however, there's no such option specified in the documentation (https://developer.mozilla.org/en-US/docs/Web/API/Worker/onmessage) as compared to the postMessage phase (https://developer.mozilla.org/en-US/docs/Web/API/Worker/postMessage). Am I missing something here?
  3. As a test I replaced the rehashedMap with an empty object; this didn't cause any freezes in the DOM. Of course, this leaves me without access to the calculated data.
  4. I looked at this thread on SO: Javascript WebWorker - Async/Await. But I'm not sure how to compare that context to mine.
Clueless_captain
  • You've used the term "JSON object" in several places above where I'm certain you just mean "object." JSON is a *textual notation* for data exchange. [(More here.)](http://stackoverflow.com/a/2904181/157247) If you're dealing with JavaScript source code, and not dealing with a *string*, you're not dealing with JSON. – T.J. Crowder Nov 18 '20 at 10:10
  • I can now understand how that's confusing in explaining the problem. Thanks for pointing it out. I'll make the changes where needed. – Clueless_captain Nov 18 '20 at 10:40
  • Do you experience this on different browsers? Do you have circular references in this data? How deep does this object go? What does the performance tab of your dev tools say the bottleneck is? Anyhow, and unrelated: don't kill your worker if you are going to start a new one from the same script; starting a worker is a huge amount of work for the browser, while keeping it waiting for an event costs nothing. – Kaiido Nov 18 '20 at 23:28
  • Please include the information you gave on the answer below as an [edit] to your question, and include a [MCVE] as has been requested. Also, do you really need to have all this data on your front thread? I guess you can't show all of it at the same time, so isn't there a way to split this request so that your worker generates only what's really needed? (if you could show a bit of your visu script that could also help). – Kaiido Nov 20 '20 at 05:08

1 Answer


Options

It's understandable that you'd associate this with the web worker. (I originally wrote that it probably had nothing to do with the worker; I was wrong, it does.) I see a few possible reasons for the problem:

  1. (We know this is not true for the OP, but may still be relevant for others.) The problem is probably that you have a lot of DOM manipulation to do once you've received the heat map. If you do that in a tight loop that never lets the main thread do anything else, the page will be unresponsive during that time.

    If that's what's going on, you have to either find a way to do the DOM manipulation more quickly (sometimes that's possible, other times not) or find a way to carve it up into chunks and process each chunk separately, yielding back to the browser between chunks so that the browser can handle any pending UI work (including rendering the new elements).

    You haven't included the DOM work being done with the heat map so it's not really possible to give you code to solve the problem, but the "carving up" would be done by processing a subset of the data and then using setTimeout(fn, 0) (possibly combined with requestAnimationFrame to ensure that a repaint has occurred) to schedule continuing the work (using fn) after briefly yielding to the browser.

  2. If it's really the time spent transferring the data between the worker and the main thread, you might be able to use a transferable object for your heat map data rather than your current object, although doing so may require significantly changing your data structure. With a transferable object, you avoid copying the data from the worker to the main thread; instead, the worker transfers the actual memory to the main thread (the worker loses access to the transferable object, and the main thread gains access to it — all without copying it). For instance, the ArrayBuffer used by typed arrays (Int32Array, etc.) is transferable.

  3. If it's really the time spent receiving the data from the worker (and from your experiments it sounds like it is), and using a transferable isn't an option (for instance, because you need the data to be in a format that isn't compatible with a transferable), the only remaining option I can see is to have the worker send the main script smaller blocks of data spaced out enough for the main thread to remain responsive. (Perhaps even sending the data as it becomes available.)
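Option 1's "carving up" can be sketched like this (`processOne` and `chunkSize` are placeholders for your actual per-entry DOM work and for tuning; this is a general pattern, not code from the question):

```javascript
// Process a big array in chunks, yielding back to the browser between chunks
// so it can handle pending UI work (rendering, input). `processOne` stands in
// for the per-entry DOM manipulation; pick `chunkSize` so each chunk stays
// well under a frame's worth of time.
function processInChunks(entries, processOne, chunkSize) {
  let index = 0;
  function doChunk() {
    const end = Math.min(index + chunkSize, entries.length);
    while (index < end) {
      processOne(entries[index++]);
    }
    if (index < entries.length) {
      // Yield briefly; optionally wrap this in requestAnimationFrame to
      // ensure a repaint has happened before continuing.
      setTimeout(doChunk, 0);
    }
  }
  doChunk();
}
```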

Closer look at #3

You've described an array of 1,600 entries, where each entry is an array of between 0 and "well over 7,000" objects, each with three numeric properties. At an average of ~3,500 objects per entry, that's 1,600 × 3,500 = 5.6 million objects. It's no surprise that cloning that data takes a fair bit of time.

Here's an example of the problem you've described:

const workerCode = document.getElementById("worker").textContent;
const workerBlob = new Blob([workerCode], { type: "text/javascript" });
const workerUrl = (window.webkitURL || window.URL).createObjectURL(workerBlob);
const worker = new Worker(workerUrl);
worker.addEventListener("message", ({data}) => {
    if ((data && data.action) === "data") {
        console.log(Date.now(), `Received ${data.array.length} rows`);
        if (data.done) {
            stopSpinning();
        }
    }
});
document.getElementById("btn-go").addEventListener("click", () => {
    console.log(Date.now(), "requesting data");
    startSpinning();
    worker.postMessage({action: "go"});
});
const spinner = document.getElementById("spinner");
const states = [..."▁▂▃▄▅▆▇█▇▆▅▄▃▂▁"];
let stateIndex = 0;
let spinHandle = 0;
let maxDelay = 0;
let intervalStart = 0;
function startSpinning() {
    if (spinner) {
        cancelAnimationFrame(spinHandle);
        maxDelay = 0;
        queueUpdate();
    }
}
function queueUpdate() {
    intervalStart = Date.now();
    spinHandle = requestAnimationFrame(() => {
        updateMax();
        spinner.textContent = states[stateIndex];
        stateIndex = (stateIndex + 1) % states.length;
        if (spinHandle) {
            queueUpdate();
        }
    });
}
function stopSpinning() {
    updateMax();
    cancelAnimationFrame(spinHandle);
    spinHandle = 0;
    if (spinner) {
        spinner.textContent = "Done";
        console.log(`Max delay between frames: ${maxDelay}ms`);
    }
}
function updateMax() {
    if (intervalStart !== 0) {
        const elapsed = Date.now() - intervalStart;
        if (elapsed > maxDelay) {
            maxDelay = elapsed;
        }
    }
}
<div>(Look in the real browser console.)</div>
<input type="button" id="btn-go" value="Go">
<div id="spinner"></div>
<script type="worker" id="worker">
const r = Math.random;
self.addEventListener("message", ({data}) => {
    if ((data && data.action) === "go") {
        console.log(Date.now(), "building data");
        const array = Array.from({length: 1600}, () =>
            Array.from({length: Math.floor(r() * 7000)}, () => ({lat: r(), lng: r(), value: r()}))
        );
        console.log(Date.now(), "data built");
        console.log(Date.now(), "sending data");
        postMessage({
            action: "data",
            array,
            done: true
        });
        console.log(Date.now(), "data sent");
    }
});
</script>

Here's an example of the worker sending the data in chunks as fast as it can but in separate messages. It makes the page responsive (though still jittery) when receiving the data:

const workerCode = document.getElementById("worker").textContent;
const workerBlob = new Blob([workerCode], { type: "text/javascript" });
const workerUrl = (window.webkitURL || window.URL).createObjectURL(workerBlob);
const worker = new Worker(workerUrl);
let array = null;
let clockTimeStart = 0;
worker.addEventListener("message", ({data}) => {
    if ((data && data.action) === "data") {
        if (clockTimeStart === 0) {
            clockTimeStart = Date.now();
            console.log(Date.now(), "Receiving data");
        }
        array.push(...data.array);
        if (data.done) {
            console.log(Date.now(), `Received ${array.length} row(s) in total, clock time to receive data: ${Date.now() - clockTimeStart}ms`);
            stopSpinning();
        }
    }
});
document.getElementById("btn-go").addEventListener("click", () => {
    console.log(Date.now(), "requesting data");
    array = [];
    clockTimeStart = 0;
    startSpinning();
    worker.postMessage({action: "go"});
});
const spinner = document.getElementById("spinner");
const states = [..."▁▂▃▄▅▆▇█▇▆▅▄▃▂▁"];
let stateIndex = 0;
let spinHandle = 0;
let maxDelay = 0;
let intervalStart = 0;
function startSpinning() {
    if (spinner) {
        cancelAnimationFrame(spinHandle);
        maxDelay = 0;
        queueUpdate();
    }
}
function queueUpdate() {
    intervalStart = Date.now();
    spinHandle = requestAnimationFrame(() => {
        updateMax();
        spinner.textContent = states[stateIndex];
        stateIndex = (stateIndex + 1) % states.length;
        if (spinHandle) {
            queueUpdate();
        }
    });
}
function stopSpinning() {
    updateMax();
    cancelAnimationFrame(spinHandle);
    spinHandle = 0;
    if (spinner) {
        spinner.textContent = "Done";
        console.log(`Max delay between frames: ${maxDelay}ms`);
    }
}
function updateMax() {
    if (intervalStart !== 0) {
        const elapsed = Date.now() - intervalStart;
        if (elapsed > maxDelay) {
            maxDelay = elapsed;
        }
    }
}
<div>(Look in the real browser console.)</div>
<input type="button" id="btn-go" value="Go">
<div id="spinner"></div>
<script type="worker" id="worker">
const r = Math.random;
self.addEventListener("message", ({data}) => {
    if ((data && data.action) === "go") {
        console.log(Date.now(), "building data");
        const array = Array.from({length: 1600}, () =>
            Array.from({length: Math.floor(r() * 7000)}, () => ({lat: r(), lng: r(), value: r()}))
        );
        console.log(Date.now(), "data built");
        const total = 1600;
        const chunks = 100;
        const perChunk = total / chunks;
        if (perChunk !== Math.floor(perChunk)) {
            throw new Error(`total = ${total}, chunks = ${chunks}, total / chunks has remainder`);
        }
        for (let n = 0; n < chunks; ++n) {
            postMessage({
                action: "data",
                array: array.slice(n * perChunk, (n + 1) * perChunk),
                done: n === chunks - 1
            });
        }
    }
});
</script>

Naturally it's a tradeoff. The total clock time spent receiving the data is longer the smaller the chunks; the smaller the chunks, the less jittery the page is. Here's really small chunks (sending each of the 1,600 arrays separately):

const workerCode = document.getElementById("worker").textContent;
const workerBlob = new Blob([workerCode], { type: "text/javascript" });
const workerUrl = (window.webkitURL || window.URL).createObjectURL(workerBlob);
const worker = new Worker(workerUrl);
let array = null;
let clockTimeStart = 0;
worker.addEventListener("message", ({data}) => {
    if ((data && data.action) === "data") {
        if (clockTimeStart === 0) {
            clockTimeStart = Date.now();
        }
        array.push(data.array);
        if (data.done) {
            console.log(`Received ${array.length} row(s) in total, clock time to receive data: ${Date.now() - clockTimeStart}ms`);
            stopSpinning();
        }
    }
});
document.getElementById("btn-go").addEventListener("click", () => {
    console.log(Date.now(), "requesting data");
    array = [];
    clockTimeStart = 0;
    startSpinning();
    worker.postMessage({action: "go"});
});
const spinner = document.getElementById("spinner");
const states = [..."▁▂▃▄▅▆▇█▇▆▅▄▃▂▁"];
let stateIndex = 0;
let spinHandle = 0;
let maxDelay = 0;
let intervalStart = 0;
function startSpinning() {
    if (spinner) {
        cancelAnimationFrame(spinHandle);
        maxDelay = 0;
        queueUpdate();
    }
}
function queueUpdate() {
    intervalStart = Date.now();
    spinHandle = requestAnimationFrame(() => {
        updateMax();
        spinner.textContent = states[stateIndex];
        stateIndex = (stateIndex + 1) % states.length;
        if (spinHandle) {
            queueUpdate();
        }
    });
}
function stopSpinning() {
    updateMax();
    cancelAnimationFrame(spinHandle);
    spinHandle = 0;
    if (spinner) {
        spinner.textContent = "Done";
        console.log(`Max delay between frames: ${maxDelay}ms`);
    }
}
function updateMax() {
    if (intervalStart !== 0) {
        const elapsed = Date.now() - intervalStart;
        if (elapsed > maxDelay) {
            maxDelay = elapsed;
        }
    }
}
<div>(Look in the real browser console.)</div>
<input type="button" id="btn-go" value="Go">
<div id="spinner"></div>
<script type="worker" id="worker">
const r = Math.random;
self.addEventListener("message", ({data}) => {
    if ((data && data.action) === "go") {
        console.log(Date.now(), "building data");
        const array = Array.from({length: 1600}, () =>
            Array.from({length: Math.floor(r() * 7000)}, () => ({lat: r(), lng: r(), value: r()}))
        );
        console.log(Date.now(), "data built");
        array.forEach((chunk, index) => {
            postMessage({
                action: "data",
                array: chunk,
                done: index === array.length - 1
            });
        });
    }
});
</script>

That's building all the data and then sending it, but if building the data takes time, interspersing building and sending may make the page's responsiveness smoother, particularly if you can send the inner arrays in smaller pieces (even sending ~7,000 objects at once still causes jitter, as we can see in the last example above).

Combining #2 and #3

Each entry in your main array is an array of objects with three numeric properties. We could instead send Float64Arrays with those values in lat/lng/value order, using the fact they're transferable:

const workerCode = document.getElementById("worker").textContent;
const workerBlob = new Blob([workerCode], { type: "text/javascript" });
const workerUrl = (window.webkitURL || window.URL).createObjectURL(workerBlob);
const worker = new Worker(workerUrl);
let array = null;
let clockTimeStart = 0;
worker.addEventListener("message", ({data}) => {
    if ((data && data.action) === "data") {
        if (clockTimeStart === 0) {
            clockTimeStart = Date.now();
        }
        const nums = data.array;
        let n = 0;
        const entry = [];
        while (n < nums.length) {
            entry.push({
                lat: nums[n++],
                lng: nums[n++],
                value: nums[n++]
            });
        }
        array.push(entry);
        if (data.done) {
            console.log(Date.now(), `Received ${array.length} row(s) in total, clock time to receive data: ${Date.now() - clockTimeStart}ms`);
            stopSpinning();
        }
    }
});
document.getElementById("btn-go").addEventListener("click", () => {
    console.log(Date.now(), "requesting data");
    array = [];
    clockTimeStart = 0;
    startSpinning();
    worker.postMessage({action: "go"});
});
const spinner = document.getElementById("spinner");
const states = [..."▁▂▃▄▅▆▇█▇▆▅▄▃▂▁"];
let stateIndex = 0;
let spinHandle = 0;
let maxDelay = 0;
let intervalStart = 0;
function startSpinning() {
    if (spinner) {
        cancelAnimationFrame(spinHandle);
        maxDelay = 0;
        queueUpdate();
    }
}
function queueUpdate() {
    intervalStart = Date.now();
    spinHandle = requestAnimationFrame(() => {
        updateMax();
        spinner.textContent = states[stateIndex];
        stateIndex = (stateIndex + 1) % states.length;
        if (spinHandle) {
            queueUpdate();
        }
    });
}
function stopSpinning() {
    updateMax();
    cancelAnimationFrame(spinHandle);
    spinHandle = 0;
    if (spinner) {
        spinner.textContent = "Done";
        console.log(`Max delay between frames: ${maxDelay}ms`);
    }
}
function updateMax() {
    if (intervalStart !== 0) {
        const elapsed = Date.now() - intervalStart;
        if (elapsed > maxDelay) {
            maxDelay = elapsed;
        }
    }
}
<div>(Look in the real browser console.)</div>
<input type="button" id="btn-go" value="Go">
<div id="spinner"></div>
<script type="worker" id="worker">
const r = Math.random;
self.addEventListener("message", ({data}) => {
    if ((data && data.action) === "go") {
        for (let n = 0; n < 1600; ++n) {
            const nums = Float64Array.from(
                {length: Math.floor(r() * 7000) * 3},
                () => r()
            );
            postMessage({
                action: "data",
                array: nums,
                done: n === 1600 - 1
            }, [nums.buffer]);
        }
    }
});
</script>

That dramatically reduces the clock time to receive the data, while keeping the UI fairly responsive.

T.J. Crowder
  • Hello @T.J. Crowder; thanks for thinking along with me. I've disabled all DOM operations (progress bars and statistical report). I've also disabled the creation of the heatmap layer and the entire Leaflet canvas running it. In another test I sent the heatmap to the main thread as part of the `onmessage` call, but didn't assign the response to the main `data` object by commenting out: `data.heatmap = e.data.heatdata;`. In both tests my problem persisted, even after clearing caches. Guess I'll have to figure out that transferable object; thanks for pointing out that option. – Clueless_captain Nov 18 '20 at 10:47
  • @Clueless_captain - I'm really surprised to hear that. But yes, a transferable should do it in that case, though it'll be a bit of a pain restructuring your data. – T.J. Crowder Nov 18 '20 at 10:59
  • Rebuilding the data from the transferable will most probably be even slower than the structured clone algorithm. One should definitely transfer transferables, but converting to transferables isn't, in my opinion, very good advice. – Kaiido Nov 18 '20 at 23:19
  • @Kaiido - Who said anything about "rebuilding" the data? I said *change* the data structure, I didn't suggest using a transferable as a temporary copy or something like that. Then use the transferable object directly to perform the DOM updates. – T.J. Crowder Nov 19 '20 at 07:02
  • Probably easier said than done then. Not all data structures can be "changed" to a flat numerical array without some sort of "rebuilding" steps. – Kaiido Nov 19 '20 at 07:16
  • I don't think so. I see quite often people say "use transferables" as if it was magic panacea, but I'm still to see someone propose how to do this transformation in a meaningful way. *And your answer is basically built on such speculation.* – Kaiido Nov 19 '20 at 07:27
  • @Kaiido - Well, "people" may do that. I am not. I'm suggesting a practical solution to the problem the OP describes. If you have another suggestion, post an answer. (And no, it isn't, but I have no desire to discuss it with you further.) – T.J. Crowder Nov 19 '20 at 07:28
  • No I don't because we lack details from the OP I did asked for. In the absence of this information it's impossible to provide a "practical solution". – Kaiido Nov 19 '20 at 07:29
  • A little update so far. Reading up on the transferables left me with quite some head-scratching for the time being, so I started experimenting with the structure of the `heatdata` object. First I replaced all float values with integers, without any noticeable improvement. Then I simply omitted all values, so I ended up with an object of 1600 numerical keys, with an array of empty objects for each key. Again, no improvement. I then used `JSON.stringify(heatdata)`. This added some time in the worker, but the transfer happened instantaneously. So it's the depth of the object causing the slowdown. – Clueless_captain Nov 19 '20 at 08:08
  • (To be added to my above comment.) This however isn't a solution, as you need `JSON.parse()` in the main thread to convert it back to an object. That method can't be run asynchronously, so you end up with a freeze there. I've been trying to understand more of the inner workings of `onmessage` and `postMessage` to figure out why it happens, but alas - no success there yet. This is behaviour I can reproduce in every browser so far (Chrome, Edge and Firefox). – Clueless_captain Nov 19 '20 at 08:13
  • @Clueless_captain - How big is each of those 1600 arrays of empty objects? And is there any way you can encode this data into a single flat `Float64Array`? In fact, if you update your question with a [mcve] demonstrating the problem that people can copy and paste into a local setup to experiment with (including representative code for what you do with the data when you receive it), we may be able to help you better. – T.J. Crowder Nov 19 '20 at 08:44
  • Those 1600 keys represent a point in time. For each point in time there's data. That data is the array; a datapoint in that array is an `object` like this: `{lat:51.123, long:23.123, value:0.7986}`. Coordinates are rounded to three decimal places; value isn't. For each point-in-time dataset, the values are aggregated by the worker if the coordinates are the same, so that can't be reduced further. The length of those 1600 arrays depends on the data we have. It ranges from 0 to well over 7000. (At this point I'm only working with 1/4th of the total dataset, so that number's still on the low side.) – Clueless_captain Nov 19 '20 at 09:01
  • Using a `Float64Array` wouldn't be an option I'm afraid; the arrays for each point in time are prepared in such a way that the heatmap plugin in Leaflet can immediately work with them. Having the main script reparse an array like that into an `object` that is understandable for the plugin would cause too many slowdowns - hence the reason this project started out with the need for a worker. I'll see if I can set up a codepen with some random data to reproduce the issue. – Clueless_captain Nov 19 '20 at 09:04
  • @Clueless_captain - That data is **massive**. :-) If I assume each entry in the 1600 element array is an array with an average of 3,500 objects in it, that's 5.6 **million** objects. But the data is also amenable to being broken into parts and sent in chunks, and the chunks are amenable to being carried by transferable objects. I've updated the answer with examples. Combining #2 and #3 seems to be quite effective. – T.J. Crowder Nov 19 '20 at 11:00
  • Me again; sorry if it bothers you, I'm really just trying to help have the best content here. So, transferring doesn't really offer a *great* improvement over cloning in terms of pure speed of messaging. Benchmarks [on same thread](https://jsfiddle.net/mdet4jg1/) and [through worker](https://jsfiddle.net/mdet4jg1/). In Firefox it's just a bit faster to transfer; in Chrome it's the same (clone even winning most of the time for me). Where it really has an impact is on memory: by avoiding useless copies of *the same* data, we avoid GC kicking in. But if the buffer is passed only one time, there's no real win. – Kaiido Nov 20 '20 at 04:22
  • @Kaiido - That doesn't jibe with the examples in the answer above. There's a significant improvement in the clock time when using the transferable. Do you see why there's a discrepancy between your test and mine above? – T.J. Crowder Nov 20 '20 at 07:37
  • @T.J.Crowder switching on and off the transfer from the last snippet I see about the same results as in the benchmark (btw clockTimeStart should probably be reset in the click event) And probably the difference is because in the other snippets you were using plain Array vs TypedArray? – Kaiido Nov 20 '20 at 07:41
  • @Kaiido - Thanks. I'll have to take a look when I have time. Rushing on something else right now. But it could be that cloning a `Float64Array` is just a lot faster than cloning an array containing a bunch of objects, which would explain it...and why I thought that was down to using transferring. Which would give us a way forward for the OP: Use something primitive that's easily cloned (or transferred, really). – T.J. Crowder Nov 20 '20 at 07:44
  • @T.J.Crowder I implemented something based on your 2nd and 3rd solutions. From my first worker I sent the `heatmap` object with keys, but the entire content of those 1600 arrays as strings. When that's done, the user receives a notification, and upon accepting the merge of that data into the original data a spinner shows up and a second worker starts parsing the strings to objects. I tested it with the entire dataset (that has gone over 20 million objects now) and it seems to work like a charm. – Clueless_captain Nov 20 '20 at 08:05
  • @Kaiido thanks for thinking with us; I think it has to do with the depth of an object. See my comment above this one. Transferring the initial `heatmap` from the first worker without converting it to a string takes forever. If I run it on the full dataset it blocks the UI for a quarter of an hour or more. Splitting the processing of the `heatmap` into chunks as suggested by T.J. Crowder improved this significantly. Even when adding the time it takes for `JSON.parse()` and `JSON.stringify()`, the full dataset takes about 3 minutes to process, with only a few seconds blocking the UI (with a spinner). – Clueless_captain Nov 20 '20 at 08:10
  • @Clueless_captain - That's great to hear! I'll still update the answer, since it currently attributes the success of the last option to transferables, which may not be correct; from your results, and the results when you turn off transferring in the last example above (currently), it looks like, as you say, it has more to do with depth/complexity than data copy time. Thanks for that info, and happy coding! – T.J. Crowder Nov 20 '20 at 11:49