I currently have a problem generating two thousand PDFs using Node.js and Bull.

Indeed, I manage to generate about 400 PDFs. Then I get this error message:

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory

This is the Bull queue and the function that generates the PDF files:

// Bull's constructor takes a single options object; defaultJobOptions
// must live inside it (a third argument is silently ignored)
let myFirstQueue = new Bull("firsttdssss", {
  redis: { port: 6379 },
  defaultJobOptions: { removeOnComplete: true, removeOnFail: true },
});

myFirstQueue.setMaxListeners(30);

for (let i = 0; i < 500; i++) {
  await myFirstQueue.add(data[i]);
}

let { generatePdf } = await import("~/lib/survey");

myFirstQueue.process(async job => {
  let filename = kebabCase(
    `${campaign.title} - ${job.data.employee.fullName.toLowerCase()} - ${moment().format("DD/MM/YYYY")} `
  );
  return generatePdf(campaign.template.body, job.data, filename, 210, 297);
});

myFirstQueue.on("progress", () => {
  console.log("progress");
});

myFirstQueue.on("completed", () => {
  console.log("completed");
});

I have already tried to increase the Node.js heap size using: `$env:NODE_OPTIONS="--max-old-space-size=8192"`

SLSofiane
  • Do them one after the other, instead of 2000 at once? Also, you can't use `await` inside `map()`, it doesn't work (doesn't await). Everybody does that, but it doesn't work. Use a `for` loop instead. Besides, your `map()` doesn't even return anything so it's the wrong method for two reasons. – Jeremy Thille Mar 11 '22 at 16:02
  • Do you think this can resolve the memory problem? Thank you – SLSofiane Mar 11 '22 at 16:06
  • Well, if your system runs out of memory after 400 concurrently, yes, doing them one after the other will solve the problem :) If you have a costly operation, don't run 2000 of them in parallel, that's just common sense. To speed things up you can try something in between, like batches of 10 or 20 at a time, but that's a bit tricky to implement – Jeremy Thille Mar 11 '22 at 16:07
  • Do you know if other solutions exist? Is it possible to generate two thousand PDF files using Node.js? – SLSofiane Mar 11 '22 at 16:09
  • Do you know where I can see a code demo using batches of 10 or 20? – SLSofiane Mar 11 '22 at 16:15
  • I get the same error using the `for` loop... – SLSofiane Mar 11 '22 at 16:19
  • FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory – SLSofiane Mar 11 '22 at 16:19
  • In this case, please update your question and provide the new code with the for loop. In theory, if you generate the PDF one by one, there's no reason to hit a memory limit – Jeremy Thille Mar 12 '22 at 16:35
  • I have edited the question. – SLSofiane Mar 12 '22 at 18:12
  • The same error ... – SLSofiane Mar 13 '22 at 09:36
  • Another problem is, I see you are using `survey-pdf`, but I could find nothing in its documentation to wait for the PDF file to be written to the disk. It's got `surveyPDF.save()` and that's it. No callback function, no Promise mode. Still, it's an asynchronous procedure, so if you launch 2000 of them, they'll be done concurrently. Maybe it's worth switching to a library that can be awaited? – Jeremy Thille Mar 14 '22 at 11:28
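The batching idea suggested in the comments can be sketched with a small helper. `processInBatches` is a hypothetical function (not from the question's code); the worker passed to it would be whatever promise-returning PDF generator you use:

```javascript
// Process items in sequential batches of `batchSize`: each batch runs
// concurrently, and the next batch only starts once the previous one
// has fully finished, capping peak memory usage.
async function processInBatches(items, batchSize, worker) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Run one batch in parallel, then wait before taking the next slice
    results.push(...(await Promise.all(batch.map(worker))));
  }
  return results;
}
```

With a `batchSize` of 10 or 20, at most that many PDF generations are in flight at once instead of all 2000, which is the compromise between speed and memory described above.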

1 Answer


I would do this with two applications. Application one would be responsible for adding jobs to the queue.

Application two would be a worker that fetches jobs one by one and generates the PDFs.

Finally, by using pm2 cluster mode, I would increase the number of workers so that more jobs can be completed in parallel.
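As a rough illustration of that setup, assuming the producer and worker live in hypothetical files `producer.js` and `worker.js`, a pm2 ecosystem file might look like this (a sketch under those assumptions, not a tested configuration):

```javascript
// ecosystem.config.js -- hypothetical pm2 config for the two-application setup
module.exports = {
  apps: [
    {
      name: "pdf-producer", // adds the 2000 jobs to the Bull queue
      script: "./producer.js",
      instances: 1,
    },
    {
      name: "pdf-worker", // each instance pulls jobs and renders PDFs
      script: "./worker.js",
      instances: 4, // raise this to complete more jobs in parallel
      exec_mode: "cluster",
    },
  ],
};
```

Because Bull queues are backed by Redis, separate worker processes can safely consume the same queue, which is what makes scaling out with more instances straightforward.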