
I have a huge dataset in MongoDB. I had to save it in parts using create, in batches of roughly 1% each. Now I want to read it back. I know that Node.js can take the load, because I read the data into a local variable before saving it to Mongo and before sending it to Angular. I get the error below, but I believe this error is coming from MongoDB/Mongoose.
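For context, the batched save looked roughly like the sketch below (saveInBatches, allDocs and batchSize are just illustrative names, not my real code):

// Save an in-memory array in parts, roughly 1% of the documents per call to create().
async function saveInBatches(Model, allDocs) {
  const batchSize = Math.ceil(allDocs.length / 100);
  for (let i = 0; i < allDocs.length; i += batchSize) {
    const batch = allDocs.slice(i, i + batchSize);
    await Model.create(batch); // insert one batch at a time
  }
}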

It would be nice if I could read the data back from Mongo in parts.
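Something like the following is what I have in mind: paging through the collection with skip/limit instead of one giant find (readInParts and pageSize are made-up names). I am not sure, though, how well this plays with the nested populate shown further down:

// Illustrative only: read the Fasta collection page by page instead of all at once.
async function readInParts(Model, pageSize = 100) {
  const all = [];
  for (let skip = 0; ; skip += pageSize) {
    const page = await Model.find({}, "healthyTissue")
      .skip(skip)
      .limit(pageSize)
      .lean(); // plain objects, lighter on memory
    if (page.length === 0) break;
    all.push(...page);
  }
  return all;
}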

(node:1096) UnhandledPromiseRejectionWarning: RangeError [ERR_BUFFER_OUT_OF_BOUNDS]: Attempt to write outside buffer bounds

Background: I am using Mongoose and populate; see here: Populate does not seem to populate for all the array elements

It works if I limit the number of documents that populate returns.

Here is the schema I am trying to recover:

const FastaSchema = new mongoose.Schema({
  header: headerSchema,
  healthyTissue: [{ type: mongoose.Schema.Types.ObjectId, ref: "Hidden" }],
});

const hiddenSchema = new mongoose.Schema({
  children: [{ type: mongoose.Schema.Types.ObjectId, ref: "FastaElement" }]
});

const fastaElement = new mongoose.Schema({
  description: String,
  sequence: String,
  comments: String
});

Here is the code I am using to populate:

app.use("/", async (req, res) => {
  const aux = await Fasta.find({}, "healthyTissue").populate({
    //---------------------first level (healthy or tumor fasta file) ----------------
    path: "healthyTissue",
    perDocumentLimit: 1,
    //---------------------    second level (hidden documents)       ----------------
    populate: {
      path: "children",
      perDocumentLimit: 1,
    },
  });

  const aux2 = [];

  aux.forEach((element) =>
    element.healthyTissue.forEach((hidden) =>
      hidden.children.forEach((leaf) => aux2.push(leaf))
    )
  );

  res.json(aux2);
});
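Another idea I have been considering is streaming the same query with a Mongoose cursor, so the whole result set never has to be materialized at once (the "/stream" route path is just for illustration). I have not verified that the nested populate behaves identically here:

app.use("/stream", async (req, res) => {
  const aux2 = [];

  // Stream documents one at a time instead of loading the whole result set.
  const cursor = Fasta.find({}, "healthyTissue")
    .populate({
      path: "healthyTissue",
      populate: { path: "children" },
    })
    .cursor();

  for await (const element of cursor) {
    element.healthyTissue.forEach((hidden) =>
      hidden.children.forEach((leaf) => aux2.push(leaf))
    );
  }

  res.json(aux2);
});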
