
We've recently crossed 130K documents in one of our collections. Since then we're facing a high memory consumption issue with Node.js. We're using the Sails Waterline.js ORM for querying MongoDB. Any call made to the DB through the Waterline API, for example Model.create, triggers the growth, and the Node process keeps consuming RAM until ~1.8 GB, then it blows up and restarts. I've been trying to debug the issue for the past week and could not find any solution. Please help.

When I deleted all the collection data, the server did not show any memory growth. Bringing back the 130K docs recreated the issue.

For example, I have a user registration endpoint /user.
It calls the following models in a row:

let user = await User.create(data);
let model2 = await Model2.create(userdata);
let model3 = await Model3.create(model2Data);
let model4 = await Model4.create(data2);

Note that none of these models hold much data. The model with the 130K documents is a different model.

I took heap dumps of the before and after states of the Node VM (screenshot). Examining them in Chrome DevTools, I found lots of DB data loaded into memory (underlined in the image; that data belongs to a different model/collection called estimates), but our /user endpoint never calls or interacts with those models. So I suppose it's Waterline or something else.

Shahid Kamal
    If you have any findings of where memory is being consumed, then you should include those in your question. This is one of those cases where screenshots from a profiler would be acceptable, as long as they are legible. I'm not saying I would discount that Waterline has anything to do with this, but it would not hurt to try to narrow down whether there was any particular process you were doing that might be the cause instead. If you really want to narrow down "is it Waterline", then I suggest creating a small listing that does little else than the suspected culprit operation and profile that. – Neil Lunn Mar 27 '19 at 08:17
  • I added the screenshot of the profile with the data that should not be present. To narrow it down I made one call to the endpoint (/user) and one by one removed calls to the other models, such as `Model2.create()`, from the flow of registration. At the end I had just `User.create()` and nothing else, but I still saw the memory leak. How do I create the small listing thing you mentioned? Can you give an example? I didn't get that part. Thanks – Shahid Kamal Mar 27 '19 at 08:47
  • @NeilLunn Thanks for the comment. I think I found the problem. It was a Waterline association. The models were misconfigured by a previous dev. – Shahid Kamal Mar 27 '19 at 14:04
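The "small listing" suggested in the comments could look something like the sketch below: run one suspected operation in isolation and report heap growth. `profileOp` is a hypothetical helper; in the real test, `op` would be the suspect call, e.g. `() => User.create(data)`:

```javascript
// Minimal-repro sketch: run one operation in a loop and measure heap growth.
// Run with `node --expose-gc` so global.gc is available for stable readings.
async function profileOp(op, iterations = 100) {
  if (global.gc) global.gc();
  const before = process.memoryUsage().heapUsed;
  for (let i = 0; i < iterations; i++) {
    await op(i); // the suspected culprit, e.g. () => User.create(data)
  }
  if (global.gc) global.gc();
  const after = process.memoryUsage().heapUsed;
  return after - before; // bytes of heap retained across the run
}
```

If the reported delta keeps climbing as `iterations` grows, the operation under test is retaining memory; if it plateaus, the leak is elsewhere.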

1 Answer


So there was a Waterline association mapping from the User collection to other collections, including the one holding the 130K documents. Removing the redundant associations between User and those collections fixed the memory leak.
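For illustration, the misconfiguration probably looked roughly like a one-to-many collection association on User pointing at the huge model. This is a hypothetical sketch (the attribute and model names `estimates`/`estimate`/`owner` are made up, not from the original project):

```javascript
// Before: User carries a collection association to the 130K-document model,
// letting Waterline pull `estimate` records into memory on User operations.
const userModelBefore = {
  attributes: {
    name: { type: 'string' },
    estimates: { collection: 'estimate', via: 'owner' }, // redundant association
  },
};

// After: the redundant association is removed. Code that genuinely needs
// estimates queries them explicitly, e.g. Estimate.find({ owner: user.id }).
const userModelAfter = {
  attributes: {
    name: { type: 'string' },
  },
};
```

Dropping the association means Waterline no longer has a reason to touch the large collection during User operations; any access to it becomes an explicit, paginated query.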

Shahid Kamal