
I'm using node-addon-api to create addons in Node.js. Every 30 seconds there is a call to one of the addon's functions, and each time the function is called, 4 double arrays are created, the largest holding 2400 values. I'm experiencing memory overload, so I was wondering whether it could be because of these 4 arrays. Do I need to handle every object that is created with the New() function, like Napi::Array::New or Napi::Number::New?

for example:

Napi::Array plotHistory_Napi_out = Napi::Array::New(env, 2400);

Do I need to release this memory? If so, how?
Or does it happen automatically when the function's scope ends, and the Node.js GC sees it and collects it?

I would appreciate it if anyone could assist me.
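
To make the setup concrete, here is a minimal sketch of what such a function might look like; the function name, export name, and data source are assumptions for illustration, not the real addon code:

```cpp
#include <napi.h>
#include <vector>

// Hypothetical addon method called from JS every 30 seconds.
// Values created with Napi::Array::New / Napi::Number::New here are local
// handles: once this function returns and the JS caller drops the result,
// they become eligible for garbage collection, with no explicit release call.
Napi::Value GetPlotHistory(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();

  // Stand-in for the real data source (assumption).
  std::vector<double> plotHistory(2400, 0.0);

  Napi::Array plotHistory_Napi_out = Napi::Array::New(env, plotHistory.size());
  for (size_t i = 0; i < plotHistory.size(); ++i) {
    // Each Napi::Number is also a local handle managed by the GC.
    plotHistory_Napi_out.Set(static_cast<uint32_t>(i),
                             Napi::Number::New(env, plotHistory[i]));
  }
  return plotHistory_Napi_out;
}

Napi::Object Init(Napi::Env env, Napi::Object exports) {
  exports.Set("getPlotHistory", Napi::Function::New(env, GetPlotHistory));
  return exports;
}

NODE_API_MODULE(addon, Init)
```

(If many temporary handles were created in a long-running loop outside a JS-invoked callback, a Napi::HandleScope could be used to bound their lifetime; inside a normal callback like the one above that is not required.)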

  • Can you provide a larger sample of code that actually demonstrates your issue? N-API is honestly just a series of wrappers that allow callers to interact with Addons, so it's just a bunch of context-aware callbacks on wrappers that abstract to `node_value` to varying degrees of absurdity depending on which part of the conversion you're at - `env` has to mean something. - Also, `2400` is not supposed to represent the *maximum* size of your data, it's the given length that it allocates. You need to infer the length from the `info.Env()` by its type - which again requires a better example. – syntaqx May 21 '21 at 10:12
  • The interfaces are really important to understand: callbacks tear down the environment, do whatever you wanted, and finalize it transformatively. If you want to get the values into your interface, you would be moving closer to `Napi::Object` or buffering the data accordingly. If the other way, then you would be rewrapping the `Object`s you converted. Without context, there's not a _ton_ of point to turning an existing `Napi::Array` into a `Napi::Array` with 100% the same data, which is why it's a fairly strange thing to implement. If you _do_ want that, that's what `ArrayBuffer` is for. – syntaqx May 21 '21 at 10:21
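
Hedging on what the real data looks like, a small sketch of the ArrayBuffer/typed-array route the second comment points at: packing the doubles into one Napi::Float64Array creates a single backing buffer instead of 2400 individual Napi::Number handles per call. The function name and data source below are made up for illustration.

```cpp
#include <napi.h>
#include <vector>
#include <cstring>

// Hypothetical variant that returns the samples as a Float64Array backed by
// one ArrayBuffer rather than thousands of separate Napi::Number handles.
Napi::Value GetPlotHistoryTyped(const Napi::CallbackInfo& info) {
  Napi::Env env = info.Env();

  std::vector<double> plotHistory(2400, 0.0);  // stand-in data source (assumption)

  Napi::Float64Array out = Napi::Float64Array::New(env, plotHistory.size());
  std::memcpy(out.Data(), plotHistory.data(),
              plotHistory.size() * sizeof(double));
  return out;
}
```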

0 Answers