
Hello all, I have a collection in MongoDB whose size is 30K documents.
When I run a find query (I am using Mongoose) from my Node server, the following problems occur:

1. It takes a long time to get the result back from the database server.
2. While creating a JSON object from the result data, the Node server crashes.

To solve the problem I tried to fetch the data as a stream (as stated in the docs). Now I am getting the documents one by one in my `stream.on('data')` callback.
Here is my code:

    var index = 1;
    var stream = MyModel.find().stream();

    stream.on('data', function (doc) {
      console.log('document number ' + index);
      index++;
    }).on('error', function (err) {
      // handle the error
    }).on('close', function () {
      // the stream is closed
    });

And the output of my code is:

    document number 1
    document number 2
    ......
    document number 30000

The output shows that the database is sending the documents one by one.

Now my question is: is there any way to fetch the data in chunks of 5000 documents?
Or is there a better way to do the same?
Thanks in advance.
I tried using `batch_size()` but it did not solve my problem.
Can I use the same streaming for map-reduce?
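To make the question concrete, here is a rough sketch of what I mean by "chunks": buffering the streamed documents and handing them off 5000 at a time. The `processChunk` handler and the chunk size are placeholders of mine, not part of any Mongoose API:

```javascript
// Buffer items one by one and invoke processChunk whenever
// chunkSize items have accumulated; flush() emits any remainder.
function makeChunker(chunkSize, processChunk) {
  var buffer = [];
  return {
    push: function (doc) {
      buffer.push(doc);
      if (buffer.length >= chunkSize) {
        processChunk(buffer);
        buffer = [];
      }
    },
    flush: function () {
      if (buffer.length > 0) {
        processChunk(buffer);
        buffer = [];
      }
    }
  };
}

// Wired up to the Mongoose stream it would look roughly like:
//   var chunker = makeChunker(5000, handleChunk); // handleChunk is hypothetical
//   stream.on('data', function (doc) { chunker.push(doc); });
//   stream.on('close', function () { chunker.flush(); });
```

Is something like this the intended pattern, or does the driver offer a built-in way to do it?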

