2

I have a table that contains over 100,000 records. Server: Node.js/Express. DB: MongoDB. On the client, a table with a pager is implemented; 10 records are requested each time.

When there were 10,000 records everything worked faster, of course, but now there is a problem with speed, and not only that.

My aggregation:

import { concat } from 'lodash';
...
let query = [{ $match: {} }];
query = concat(query, [{ $sort: { createdAt: -1 } }]);
query = concat(query, [
    { $skip: (pageNum - 1) * perPage }, // 0
    { $limit: perPage }                 // 10
]);
return User.aggregate(query)
           .collation({ locale: 'en', strength: 2 })
           .then((users) => ...);

2 cases:

  1. the first fetch is very slow

  2. when I click to the last page, I get an error:

    MongoError: Sort exceeded memory limit of 104857600 bytes, but did not opt in to external sorting. Aborting operation. Pass allowDiskUse:true to opt in.

Please tell me: am I building the aggregation incorrectly, or is there a memory problem on the server, as the error says, so that additional nginx settings are needed (another person handles that), or is the problem more complex, or is it perhaps something else altogether?

Added:

I noticed that the index is not used when sorting, although it should be used, shouldn't it?


The aggregation pipeline as printed by console.log:

[
 {
      "$match": {}
 },
 {
      "$lookup": {
           ...
      }
 },
 {
      "$project": {
           ...,
"createdAt": 1,
           ...
      }
 },
 {
      "$match": {}
 },
 {
      "$sort": {
           "createdAt": -1
      }
 },
 {
      "$skip": 0
 },
 {
      "$limit": 10
 }
]
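One thing the logged pipeline makes visible: the $sort sits after a $lookup and a $project, and once documents have been reshaped by those stages, MongoDB can no longer use the createdAt index for the sort — an index-backed sort needs $sort at the front of the pipeline, preceded at most by an indexable $match. A minimal sketch of reordering the stages (the collection and field names in the $lookup are placeholders, since the original ones are elided) so that the sort is index-backed and the join only runs on the current page's 10 documents:

```javascript
// Build a pipeline where $sort/$skip/$limit run BEFORE $lookup,
// so the { createdAt: -1 } index can back the sort and the
// $lookup only joins the documents of the current page.
function buildPagedPipeline(pageNum, perPage) {
  return [
    { $match: {} },                       // any indexed filters go here
    { $sort: { createdAt: -1 } },         // index-backed: nothing reshapes docs before it
    { $skip: (pageNum - 1) * perPage },
    { $limit: perPage },
    { $lookup: {                          // now joins only `perPage` docs
        from: 'otherCollection',          // placeholder collection name
        localField: 'otherId',            // placeholder field names
        foreignField: '_id',
        as: 'other'
    } },
    { $project: { createdAt: 1, other: 1 } } // keep whatever fields the client needs
  ];
}

const pipeline = buildPagedPipeline(3, 10);
console.log(pipeline[1]);       // { '$sort': { createdAt: -1 } }
console.log(pipeline[2].$skip); // 20
```

With this ordering, MongoDB can also coalesce $sort with the following $limit, so it only keeps the top `skip + limit` documents in memory instead of sorting the whole collection.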

Thanks for any answers, and sorry for my English :)

Roman Nozhenko
  • 698
  • 8
  • 21
https://stackoverflow.com/questions/26375017/mongo-error-when-using-aggregation-sort-exceeded-memory-limit ; also, it is good to set some initial filters on what you search and to use indexes ... – R2D2 Jan 26 '21 at 11:24
The fact of the matter is that I need all records from the table in pages of 10, which means that none of the many indexes participate in this query. First 10, next query => next 10 items – Roman Nozhenko Jan 27 '21 at 06:09
*the field "createdAt" used for the sort operation was indexed – Roman Nozhenko Jan 27 '21 at 07:27

1 Answer

2

It does say that you've hit the memory limit, which makes sense, considering that you're trying to sort through 100,000 records. I'd try using return User.aggregate(query, { allowDiskUse: true }) //etc, and see if that helps your issue.

Whilst this isn't the documentation for the Node.js driver specifically, this link summarizes what the allowDiskUse option does (in short, it allows MongoDB to go past the 100 MB memory limit and use your system storage to temporarily store some information while it performs the query).
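For reference, Mongoose's Aggregate also supports a chained spelling, which matches the question's existing .collation(...) chaining. The stub below is purely hypothetical — it just mimics the fluent API so the snippet runs without a database connection; with a real model you would chain the same calls on User.aggregate(query):

```javascript
// Chained form (what you would write with a real Mongoose model):
//   return User.aggregate(query)
//              .allowDiskUse(true)                       // opt in to external sorting
//              .collation({ locale: 'en', strength: 2 })
//              .then((users) => ...);

// Hypothetical stand-in for the fluent Aggregate API, so this snippet runs standalone:
function fakeAggregate(pipeline) {
  const state = { pipeline, options: {} };
  const agg = {
    allowDiskUse(v) { state.options.allowDiskUse = v; return agg; },
    collation(c)    { state.options.collation = c;    return agg; },
    state,
  };
  return agg;
}

const agg = fakeAggregate([{ $match: {} }, { $sort: { createdAt: -1 } }])
  .allowDiskUse(true)
  .collation({ locale: 'en', strength: 2 });

console.log(agg.state.options.allowDiskUse); // true
```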

NeuronButter
  • 709
  • 1
  • 8
  • 24
Thank you, I completely forgot about it. However, I think this will only solve the memory problem when sorting a large amount of data. But what about the speed of the server's response? I simply request the first 10 records after sorting with $match = {}, so no indexes are used. I wanted to know if there are any methods to speed up such requests? – Roman Nozhenko Jan 27 '21 at 07:01
  • 1
    And I have a slightly different syntax: User.aggregate(query).allowDiskUse(true).collation(...).then(...) – Roman Nozhenko Jan 27 '21 at 07:03