
I've got a Google Cloud Function that is running out of memory even though it shouldn't need to.

The function compiles info from a number of spreadsheets. The spreadsheets are large, but they are handled sequentially. Essentially the function does:

spreadsheets.forEach(spreadsheet => {
   const data = spreadsheet.loadData();
   mainSpreadsheet.saveData(data);
});

The data is discarded on each loop, so the garbage collector could clean up the memory, but in practice that doesn't seem to happen, and the process is crashing close to the end.

I can see from other answers that it is possible to force garbage collection or even prevent Node from over-allocating memory.

However, both of these involve command-line arguments, which I can't control with a Cloud Function. Is there any workaround, or am I stuck with this as an issue when using Google Cloud Functions?

ChrisJ

1 Answer


A colleague tipped me off that changing the code to

spreadsheets.forEach(spreadsheet => {
   let data = spreadsheet.loadData();
   mainSpreadsheet.saveData(data);
   data = null;
});

might be enough to tip the GC off that it can clean up that structure.

I was skeptical, but the function is now running to completion. It turns out you can hint to the GC in Node.
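For anyone wanting to try the same thing, here is a minimal sketch of the pattern. `loadData`/`saveData` are stand-ins for the real spreadsheet calls (assumed, not from any library). Using `let` and nulling `data` at the end of each iteration drops the last reference before the next large allocation, so V8 is free to reclaim the old buffer between iterations; `process.memoryUsage()` lets you watch whether that actually happens.

```javascript
// Sketch only: loadData/saveData are hypothetical placeholders for the
// real spreadsheet API used in the question.
function processAll(spreadsheets, mainSpreadsheet) {
  spreadsheets.forEach(spreadsheet => {
    let data = spreadsheet.loadData();  // large, short-lived allocation
    mainSpreadsheet.saveData(data);
    data = null;                        // hint: this structure is no longer needed
  });
}

// Optional: log heap usage per iteration to confirm memory is being reclaimed.
function logHeap(label) {
  const mb = process.memoryUsage().heapUsed / 1024 / 1024;
  console.log(`${label}: heapUsed ${mb.toFixed(1)} MB`);
}
```

Note this is only a hint, not a guarantee: V8 still decides when (or whether) to collect, but releasing the reference early at least makes the memory eligible for collection before the next iteration allocates again.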

ChrisJ