I have a collection with over 100,000 records. Server: Node.js/Express.js. DB: MongoDB. On the client, a table with a pager is implemented; 10 records are requested per page.
When the collection held 10,000 records everything of course worked faster, but now there is a problem with speed, and not only speed.
My aggregation:
import { concat } from 'lodash';
...
// Build the pipeline: match everything, sort newest first, then paginate.
let query = [{ $match: {} }];
query = concat(query, [{ $sort: { createdAt: -1 } }]);
query = concat(query, [
  { $skip: (pageNum - 1) * perPage }, // 0
  { $limit: perPage }                 // 10
]);
return User.aggregate(query)
  .collation({ locale: 'en', strength: 2 })
  .then((users) => ...);
Two problems:
1. The first fetch is very slow.
2. When I jump to the last page, I get this error:
MongoError: Sort exceeded memory limit of 104857600 bytes, but did not opt in to external sorting. Aborting operation. Pass allowDiskUse:true to opt in.
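For reference, if I just opt in the way the error message suggests, I believe it would look like this with Mongoose's aggregate builder (a sketch of the opt-in only; as I understand it, allowDiskUse lets $sort spill to disk but does not make the sort itself fast):

// Sketch: opting in to external sorting, as the MongoError suggests.
return User.aggregate(query)
  .collation({ locale: 'en', strength: 2 })
  .allowDiskUse(true) // let $sort exceed the 100 MB in-memory limit
  .then((users) => ...);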
Please tell me: am I building the aggregation incorrectly, is there a memory problem on the server as the error says (meaning additional nginx settings are needed; another person handles that), is the problem a combination of these, or is it perhaps something else altogether?
Added:
I noticed that the index is not used for the sort, although I would expect it to be (see the index sketch after the pipeline below).
The aggregation pipeline that actually executes (console.log output):
[
  {
    "$match": {}
  },
  {
    "$lookup": {
      ...
    }
  },
  {
    "$project": {
      ...,
      "createdAt": 1,
      ...
    }
  },
  {
    "$match": {}
  },
  {
    "$sort": {
      "createdAt": -1
    }
  },
  {
    "$skip": 0
  },
  {
    "$limit": 10
  }
]
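As far as I understand, two things could explain the unused index, and I would like to confirm both: the $sort above runs after $lookup and $project, so the optimizer presumably cannot push it down to an index, and a collated query can only use an index created with the same collation. A sketch of the index definition I think would be needed (assuming a Mongoose schema named UserSchema; the name is hypothetical):

// Sketch (assumption): a descending createdAt index declared with the same
// collation as the query; MongoDB will not use an index for a collated
// query unless the index collation matches.
UserSchema.index(
  { createdAt: -1 },
  { collation: { locale: 'en', strength: 2 } }
);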
Thanks for any answers, and sorry for my English :)