
I have a web server running on embedded Linux on an ARMv7 SBC (Atmel SAMA5D27-SOM with 128 MB RAM).

The firmware update process requires a .tar.gz file to be uploaded via this interface, then extracted, after which further processes are run on it.

After a successful upload using the busboy module, I noticed that memory usage stays high (see attached image) even though the upload is streamed directly to a file (code below).

Further processes launched with `child_process` after the file upload are then killed because the system runs out of memory.

I have set the `--max-old-space-size` flag to 32 MB, as I've read this lowers the threshold at which the GC starts.

I have tried various `child_process` APIs, including:

1. `execSync`
2. async `exec` (as described in the Node.js docs)
3. `spawn` with `{ detached: true }`

/* imports omitted for brevity */

module.exports = async (req, res, next) => { // eslint-disable-line no-unused-vars
  try {

    /* File validation omitted for brevity */

    let updateFilename;

    busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
      /* Further file validation omitted for brevity,
         File transfer debugging omitted for brevity */


      // Handle saving file stream to file
      updateFilename = path.basename(filename);
      const saveTo = path.join(updatesDir, updateFilename);
      file.pipe(fs.createWriteStream(saveTo));
    });

    busboy.on('finish', () => {
      return res.json({ updateFilename });
    });

    // Piping the request into busboy is required to stream upload progress back to the browser
    return req.pipe(busboy);
  } catch (error) {
    console.error(error);
    next(error);
  }
};

I understand that Node.js uses a garbage collector; however, the memory taken by the file upload isn't freed for later processes to use.

Is there something wrong with how I am streaming the file upload to a file? Why is Node.js not releasing the memory? Is there a way to trigger GC manually in this instance?
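For the "trigger GC manually" part of the question, a minimal sketch: Node only exposes a manual GC hook when started with the real `--expose-gc` flag (e.g. `node --expose-gc app.js`); `forceGC` is an illustrative name of my own:

```javascript
// Sketch: global.gc() exists only when Node is started with --expose-gc.
// Guard the call so this is a harmless no-op when the flag is absent.
function forceGC() {
  if (typeof global.gc === 'function') {
    global.gc();   // request a full collection
    return true;
  }
  return false;    // started without --expose-gc
}
```

Note that even after a forced collection, V8 does not necessarily return freed pages to the OS right away, so the process RSS can remain high while the heap itself is mostly empty.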

wntwrk
  • I'm afraid you may be as low as node.js can actually get on memory usage. Even if you trigger GC a lot, `openssl` and others will take up a lot of memory. You can get node on a diet, but I do think you may get into this or that kind of trouble - anyway, see this issue [nodejs/node#2948](https://github.com/nodejs/node/issues/2948). Another option worth checking is [low.js](https://www.lowjs.org/) - a low memory node (but it's only es5 compatible)... – Michał Karpacki May 29 '19 at 18:33
  • @MichałKarpacki Thanks for your comment. After some significant debugging, the solution came in adding a large swap file to the system. The system image didn't include swap memory by default. This has resolved all OOM issues. – wntwrk May 30 '19 at 13:40
  • Well, if that works for you then why not... Please keep in mind that swapping is very slow and you will hit performance issues. I'd still pursue a low mem node version. – Michał Karpacki May 30 '19 at 19:29

1 Answer


Further testing revealed that the embedded Linux system didn't include swap memory by default. I created a swap file the same size as the system's RAM (128 MB). Later I increased this to give extra overhead, since there was extra space available on the SD card.

This allows Linux to manage the OOM condition automatically by moving non-critical memory onto disk. The process slowed the upload by a few extra seconds, but it stopped all OOM issues.
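For reference, a sketch of the swap-file setup described above (the size and `/swapfile` path are assumptions matching the 128 MB figure; adjust for the target system, and run as root):

```shell
# Create a 128 MB swap file, matching the system RAM
dd if=/dev/zero of=/swapfile bs=1M count=128
chmod 600 /swapfile        # swap files must not be world-readable
mkswap /swapfile           # format it as swap space
swapon /swapfile           # enable it immediately
# Persist across reboots:
echo '/swapfile none swap sw 0 0' >> /etc/fstab
```

Keep in mind that swapping to an SD card is slow and adds write wear, so sizing the file generously only makes sense if the card has spare capacity and the OOM pressure is occasional.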

wntwrk