
I have php and nodejs installed in my server. I am calling a nodejs code which uses node canvas via php

Like:

<?php
    exec('node path/to/the/js/file');
?>

Problem:

Each execution consumes around 250 MB of memory because of node-canvas. So if my server has around 8 GB of memory, only about 32 users can use the service at any given point in time, and there is a risk of crashing the server if the number of users exceeds that.

Is there any scalable solution to this problem?

UPDATE: I have to use canvas on the server side because of my business requirements, so I am using node-canvas, but it consumes so much memory that it is causing a serious problem.

Abhinav
  • Bit broad isn't it? We don't know your requirements, nor do we know what the use case is to work out what might be best to suggest as a relevant solution? – Jonnix Apr 03 '17 at 10:35
  • Limit number of requests in the httpd of choice? – Alex Blex Apr 03 '17 at 10:42
  • @JonStirling : hi, I have updated the question, we have already implemented a business logic which uses node canvas – Abhinav Apr 03 '17 at 11:04

1 Answer


Your problem is that you start a new Node.js process for each request; that is why the memory footprint is so large, and it isn't what Node.js is built for.
Node.js is built to handle many different requests in a single process. Use that to your advantage.

What I advise you to do is have only one Node.js process running, and find another way to communicate between your PHP process and the Node.js process.

There are many different ways to do that, some more performant than others, some harder to build than others. All have pros and cons, but since both are web-related languages, you can be sure there is support in both for HTTP requests.

So what you should do is build a basic Node.js/Express server, probably with only one API endpoint, which executes the code you already wrote and returns the result. It is easy enough to do (especially if you use JSON to communicate between them), and while I don't know PHP, I'm pretty sure it is easy to send an HTTP request and interpret the answer.

If you are ready to dig into Node.js, you could try sockets or a message queue (MQ), which should be more performant.

That way, you only have one Node.js process, which shouldn't grow much in memory and can handle a lot more clients; you will not have to use exec, and you get a first try with Express.

DrakaSAN
  • hi, thanks for the suggestion. I am actually a bit confused about what you are saying: even if one process handles multiple requests, will all the requests run simultaneously? Because if they run simultaneously then it would definitely consume memory, right? Sorry if I sound a bit naive, I don't have much idea about Node.js. – Abhinav Apr 03 '17 at 13:31
  • It would probably be simpler to discuss on the [node.js chat room](https://chat.stackoverflow.com/rooms/642/node-js), but to summarize: Node.js, unlike PHP, handles multiple requests in one process, and is good at doing so. Yes and no: it will depend on whether you use asynchronous programming, and even then it will not be true concurrency, but from a human POV, yes, they will run simultaneously. Compared to one process per request, it will take less memory. – DrakaSAN Apr 03 '17 at 13:40
  • What you should do is go and look at Node.js and what drives it; you will understand why one process is enough. PHP and Node.js are really two different beasts and work in radically different ways. – DrakaSAN Apr 03 '17 at 13:43