After several hours of fruitless searching, I am posting this question. I suppose it is a duplicate of this one: How do you run RServe on AWS Lambda with NodeJS? But since it seems that the author of that question never accomplished their goal, I am going to try again.

What I currently have:

A NodeJS server that invokes an R script through Rserve, passing it data to evaluate via node-rio.

The function responsible for that looks like this:

const path = require('path');
const rio = require('rio'); // node-rio

const R = (arg1, arg2) => {
  const args = { arg1, arg2 };
  // send data to Rserve to evaluate
  return rio.$e({
    filename: path.resolve('./r-scripts/street.R'),
    entrypoint: 'run',
    data: args,
  })
    .then((data) => JSON.parse(data))
    .catch((err) => Promise.reject(`err: ${err}`));
};

And this works just fine. I am sending data over to my R instance and getting results back into my server.
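
For context, I call this from an Express-style route more or less like below (the route path and the request fields are illustrative, not my actual API):

const express = require('express');

const app = express();
app.use(express.json());

// Illustrative route: forwards two fields from the request body to Rserve
app.post('/evaluate', (req, res) => {
  R(req.body.arg1, req.body.arg2)
    .then((result) => res.json(result))
    .catch((err) => res.status(500).send(err));
});

app.listen(3000);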

What I am ultimately trying to achieve:

Every request seems to spawn its own R workspace, which carries considerable memory overhead. As a result, serving even hundreds of concurrent requests with this approach is impossible: my AWS EC2 instance runs out of memory pretty quickly. So I am looking for a way to offload the memory-intensive parts to AWS Lambda and get rid of that overhead.

I guess the specific question in my case is whether there is a way to package R and Rserve together with a NodeJS Lambda function, or, failing that, a convincing reason why this approach cannot work on Lambda, so that I know to look for an alternative.
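
To make the idea concrete, the kind of handler I imagine would look roughly like this. To be clear, this is a sketch of the intent, not working code: the bundled R path ('./r/bin/R'), the Rserve start-up step, and the fixed wait are all assumptions about what such a package might look like.

const { spawn } = require('child_process');
const path = require('path');
const rio = require('rio');

let rservePromise = null;

// Hypothetical: start a bundled Rserve once per Lambda container and reuse it
// across invocations of the same container.
const ensureRserve = () => {
  if (!rservePromise) {
    rservePromise = new Promise((resolve) => {
      // Assumption: R and Rserve binaries are shipped inside the deployment package
      spawn(path.resolve('./r/bin/R'), ['CMD', 'Rserve', '--vanilla'], {
        detached: true,
        stdio: 'ignore',
      });
      // Crude wait for Rserve to start accepting connections
      setTimeout(resolve, 2000);
    });
  }
  return rservePromise;
};

exports.handler = (event) =>
  ensureRserve().then(() =>
    rio.$e({
      filename: path.resolve('./r-scripts/street.R'),
      entrypoint: 'run',
      data: { arg1: event.arg1, arg2: event.arg2 },
    })
  ).then((data) => JSON.parse(data));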

Note: I cannot use anything other than R, since these are external R scripts that I have to invoke from my server.

Thanks in advance!
