For example, I have documents like this in the collection:
```js
{
    "key": "key1",
    "time": 1000,
    "values": [] // this one is optional
}
```
I need to update the collection from, say, a CSV file by modifying or removing the `values` column, using `key` and `time` as the filter.
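For context, a minimal sketch of how I think of each CSV row as one update unit; the column layout (`key,time,values`, with `values` semicolon-separated and possibly empty) and the Go code are illustrative assumptions only:

```go
// Sketch: read CSV rows into (key, time, values) tuples.
// Assumed (hypothetical) layout: key,time,values with no header row and
// "values" semicolon-separated or empty.
package csvload

import (
	"encoding/csv"
	"os"
	"strconv"
	"strings"
)

type row struct {
	Key    string
	Time   int64
	Values []string // nil when the CSV row has no values
}

func readRows(path string) ([]row, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	records, err := csv.NewReader(f).ReadAll()
	if err != nil {
		return nil, err
	}

	rows := make([]row, 0, len(records))
	for _, rec := range records {
		t, err := strconv.ParseInt(rec[1], 10, 64)
		if err != nil {
			return nil, err
		}
		var values []string
		if rec[2] != "" {
			values = strings.Split(rec[2], ";")
		}
		rows = append(rows, row{Key: rec[0], Time: t, Values: values})
	}
	return rows, nil
}
```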
What I've tried so far:
- `DeleteMany` with a filter of the form `or(and(key: key1, time: time2), ...)` (plus ~276k more `or` arguments), followed by `InsertMany` with 276k documents => ~90 seconds
- Bulk `ReplaceOne` with `filter: and(key: key1, time: time2)` => ~40 seconds
- Splitting the huge bulk into several smaller batches (7500 seems to be the most performant), but this one is not atomic in terms of a DB operation => ~35 seconds
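For reference, the bulk `ReplaceOne` variant looks roughly like this; it is a sketch using the official Go driver, and the `row` struct, batch size, and `upsert` flag are illustrative assumptions rather than exact code:

```go
// Sketch of the unordered bulk ReplaceOne approach (Go driver shown for
// concreteness; struct names and batching are illustrative).
package bulkreplace

import (
	"context"

	"go.mongodb.org/mongo-driver/bson"
	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

type row struct { // same shape as the CSV rows above
	Key    string
	Time   int64
	Values []string
}

// replaceAll issues one ReplaceOne per CSV row, split into unordered batches.
func replaceAll(ctx context.Context, coll *mongo.Collection, rows []row, batchSize int) error {
	opts := options.BulkWrite().SetOrdered(false) // bulk.ordered = false

	for start := 0; start < len(rows); start += batchSize {
		end := start + batchSize
		if end > len(rows) {
			end = len(rows)
		}

		models := make([]mongo.WriteModel, 0, end-start)
		for _, r := range rows[start:end] {
			filter := bson.D{{Key: "key", Value: r.Key}, {Key: "time", Value: r.Time}}

			replacement := bson.D{{Key: "key", Value: r.Key}, {Key: "time", Value: r.Time}}
			if r.Values != nil { // omit "values" entirely when the row removes it
				replacement = append(replacement, bson.E{Key: "values", Value: r.Values})
			}

			models = append(models,
				mongo.NewReplaceOneModel().
					SetFilter(filter).
					SetReplacement(replacement).
					SetUpsert(true)) // upsert is an assumption: insert if the (key, time) pair is new
		}

		if _, err := coll.BulkWrite(ctx, models, opts); err != nil {
			return err
		}
	}
	return nil
}
```

With `batchSize` set to `len(rows)` this is the single ~40-second bulk; with 7500 it is the faster but non-atomic variant.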
Notes:
- All tests were run with `bulk.ordered = false` to improve performance.
- There is a unique index on `key: 1, time: -1`.
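For completeness, the unique index corresponds to `db.collection.createIndex({ key: 1, time: -1 }, { unique: true })` in the shell; in the same Go-driver sketch style:

```go
// Sketch of how the unique { key: 1, time: -1 } index is declared.
package indexsetup

import (
	"context"

	"go.mongodb.org/mongo-driver/bson"
	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

func ensureIndex(ctx context.Context, coll *mongo.Collection) error {
	_, err := coll.Indexes().CreateOne(ctx, mongo.IndexModel{
		Keys:    bson.D{{Key: "key", Value: 1}, {Key: "time", Value: -1}},
		Options: options.Index().SetUnique(true),
	})
	return err
}
```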
Is there any way to optimize this kind of request? I know Mongo can burst up to ~80k inserts/s, but what about replacements?