I'm getting a "JavaScript heap out of memory" error in my Node.js application. I'm trying to insert 408,000 records into MongoDB in a single run. I have two nested loops: the outer one runs 24 times and the inner one 17,000 times. The data comes from a NetCDF file: I parse the file, build the model objects, and insert them into MongoDB.
I've seen some posts on Stack Overflow about this problem saying that I can increase Node's memory with --max_old_space_size (e.g. node --max_old_space_size=4096 app.js). But I don't know if that's the right way to go. Maybe you have some suggestions to optimize my code?
Here are my loops:
for (var time_pos = 0; time_pos < 24; time_pos++) {
    // This array contains 17,000 values
    var dataSliced = file.root.variables['pm10_conc'].readSlice(
        time_pos, time_size,
        level_pos, level_size,
        lat_from, lat_size,
        lng_from, lng_size
    );
    // Loop: 0 to 17,000
    for (var i = 0; i < dataSliced.length; i++) {
        var pollution = new Pollution();
        pollution.latitude = current_lat;
        pollution.longitude = current_lng;
        pollution.country = country_name;
        pollution.model = model_name;
        pollution.data_type = type_name;
        pollution.level = 0;
        pollution.datetime = date;
        pollution.pollutants.pm10.description = description;
        pollution.pollutants.pm10.units = units;
        pollution.pollutants.pm10.concentration = dataSliced[i];
        pollution.save(function (err) {
            if (err) throw err;
            console.log("Data saved");
        });
    }
}
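I suspect part of the problem is that each iteration starts a save() without waiting for the previous one to finish, so all 408,000 documents and their callbacks stay alive at once. A rough sketch of how the number of in-flight saves could be capped (runLimited is a hypothetical helper, not code I actually run; in practice each task would wrap one pollution.save call):

```javascript
// Run async tasks at most `limit` at a time instead of all at once.
// Each task is a function taking a completion callback.
function runLimited(tasks, limit, done) {
    var running = 0, index = 0, finished = 0;
    function launch() {
        while (running < limit && index < tasks.length) {
            running++;
            tasks[index++](function () {
                running--;
                finished++;
                if (finished === tasks.length) return done();
                launch();
            });
        }
    }
    if (tasks.length === 0) return done();
    launch();
}
```

With something like this, only `limit` documents would be waiting on MongoDB at any moment, instead of the whole data set.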
And here is my error:
<--- Last few GCs --->
56782 ms: Mark-sweep 1366.6 (1436.9) -> 1366.6 (1436.9) MB, 1943.5 / 0.0 ms [allocation failure] [GC in old space requested].
58617 ms: Mark-sweep 1366.6 (1436.9) -> 1366.6 (1436.9) MB, 1834.9 / 0.0 ms [allocation failure] [GC in old space requested].
60731 ms: Mark-sweep 1366.6 (1436.9) -> 1368.6 (1417.9) MB, 2114.3 / 0.0 ms [last resort gc].
62707 ms: Mark-sweep 1368.6 (1417.9) -> 1370.7 (1417.9) MB, 1975.8 / 0.0 ms [last resort gc].
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x3a7c3fbcfb51 <JS Object>
1: fnWrapper [/var/www/html/Project/node_modules/hooks-fixed/hooks.js:185] [pc=0x6ccee7825d4] (this=0x3a7c3fbe6119 <JS Global Object>)
2: fn [/var/www/html/Project/node_modules/mongoose/lib/schema.js:~250] [pc=0x6ccee7d8ffe] (this=0xd29dd7fea11 <a model with map 0x994a88e5849>,next=0x1cbe49858589 <JS Function fnWrapper (SharedFunctionInfo 0x3d8ecc066811)>,done=0x1cbe498586...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::Abort() [node]
2: 0x1098b2c [node]
3: v8::Utils::ReportApiFailure(char const*, char const*) [node]
4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [node]
5: v8::internal::Factory::NewTransitionArray(int) [node]
6: v8::internal::TransitionArray::Insert(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Map>, v8::internal::SimpleTransitionFlag) [node]
7: v8::internal::Map::CopyReplaceDescriptors(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::DescriptorArray>, v8::internal::Handle<v8::internal::LayoutDescriptor>, v8::internal::TransitionFlag, v8::internal::MaybeHandle<v8::internal::Name>, char const*, v8::internal::SimpleTransitionFlag) [node]
8: v8::internal::Map::CopyAddDescriptor(v8::internal::Handle<v8::internal::Map>, v8::internal::Descriptor*, v8::internal::TransitionFlag) [node]
9: v8::internal::Map::CopyWithField(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::FieldType>, v8::internal::PropertyAttributes, v8::internal::Representation, v8::internal::TransitionFlag) [node]
10: v8::internal::Map::TransitionToDataProperty(v8::internal::Handle<v8::internal::Map>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::Object::StoreFromKeyed) [node]
11: v8::internal::LookupIterator::PrepareTransitionToDataProperty(v8::internal::Handle<v8::internal::JSObject>, v8::internal::Handle<v8::internal::Object>, v8::internal::PropertyAttributes, v8::internal::Object::StoreFromKeyed) [node]
12: v8::internal::StoreIC::LookupForWrite(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::Object::StoreFromKeyed) [node]
13: v8::internal::StoreIC::UpdateCaches(v8::internal::LookupIterator*, v8::internal::Handle<v8::internal::Object>, v8::internal::Object::StoreFromKeyed) [node]
14: v8::internal::StoreIC::Store(v8::internal::Handle<v8::internal::Object>, v8::internal::Handle<v8::internal::Name>, v8::internal::Handle<v8::internal::Object>, v8::internal::Object::StoreFromKeyed) [node]
15: v8::internal::Runtime_StoreIC_Miss(int, v8::internal::Object**, v8::internal::Isolate*) [node]
16: 0x6ccee4092a7
Aborted
[nodemon] app crashed - waiting for file changes before starting...
Do you know if there is a way to optimize my code, or is increasing Node's memory the best option?
EDIT
I have a working solution now. I first tried Mongoose's insertMany(), but I hit the same fatal allocation error. So I removed the new Pollution() instantiation and pushed plain objects into an array instead. Then I used collection.insert with async.each, like this:
var pollution = [];

for (var time_pos = 0; time_pos < 24; time_pos++) {
    // This array contains 17,000 values
    var dataSliced = file.root.variables['pm10_conc'].readSlice(
        time_pos, time_size,
        level_pos, level_size,
        lat_from, lat_size,
        lng_from, lng_size
    );
    async.each(dataSliced, function (item, next) {
        pollution.push({
            'latitude': current_lat,
            'longitude': current_lng,
            'country': country_name,
            'model': model_name,
            'data_type': type_name,
            'level': 0,
            'datetime': date,
            'pollution': {
                'pm10': {
                    'description': description,
                    'units': units,
                    'concentration': item
                }
            }
        });
        next();
    });
}

Pollution.collection.insert(pollution, function (err, docs) {
    if (err) throw err;
    console.log("Data saved");
});
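If the single insert of all 408,000 documents ever strains memory again, I could split it into fixed-size batches so only one batch is in flight at a time. A sketch of that idea (the chunk helper and the batch size of 1000 are made up for illustration; it assumes the async library and the Pollution model from above):

```javascript
// Split an array into batches of at most `size` elements.
function chunk(array, size) {
    var batches = [];
    for (var i = 0; i < array.length; i += size) {
        batches.push(array.slice(i, i + size));
    }
    return batches;
}

// Insert the batches one after another, so only one batch
// is being serialized and sent to MongoDB at any moment.
function insertInBatches(docs, done) {
    async.eachSeries(chunk(docs, 1000), function (batch, next) {
        Pollution.collection.insert(batch, next);
    }, done);
}
```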
If you have a better solution, feel free to post it as an answer.