We are currently working with Node.js and LokiJS. Our application deals with real-time data, and round-trips to an external NoSQL/relational database would introduce latency problems, so we decided to use an in-memory database, i.e. LokiJS.
LokiJS works well when a collection holds a few hundred documents, but it performs poorly under frequent updates combined with parallel reads. Concretely: one of our vendors publishes a Kafka endpoint; we consume the feed from it and serve the data on to an external service. From the Kafka topic we receive 100-200 events per second, and on every event we update the existing collection. Because these delta updates are so frequent, the collection updates do not complete cleanly before the next read, and reads return inconsistent results.
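One pattern that can help here (a sketch, under the assumption that the updates arrive from async Kafka handlers and interleave with reads; `UpdateQueue` and `applyDelta` are illustrative names, not LokiJS or Kafka APIs) is to funnel every update through a single promise chain, so each update fully finishes before the next one starts:

```javascript
// Sketch: serialize async updates so they never interleave.
// `applyDelta` is a placeholder for whatever applies one event to the collection.
class UpdateQueue {
  constructor(applyDelta) {
    this.applyDelta = applyDelta;
    this.chain = Promise.resolve(); // tail of the in-flight work
  }

  // Queue one Kafka event; the returned promise resolves when
  // this event's update (and all earlier ones) has completed.
  push(event) {
    this.chain = this.chain.then(() => this.applyDelta(event));
    return this.chain;
  }
}
```

With this in place, the Kafka consumer callback only ever calls `queue.push(event)`, and reads observe the collection between whole updates rather than in the middle of one.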
Here is my collection creation snippet.
let db = new loki('c:\\products.json', { // note: a single '\p' in a JS string literal collapses to 'p'
    autoload: true,
    autosave: true,
    autosaveInterval: 4000
});

this.collection = db.addCollection('REALTIMEFEED', {
    indices: ["id", "type"],
    adaptiveBinaryIndices: true,
    autoupdate: true,
    clone: true
});
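As a side note, if `id` is unique per document, declaring it under LokiJS's `unique` collection option enables keyed lookups via `collection.by('id', value)` instead of a `findOne` query on every event. A sketch of what that could look like (option values are illustrative):

```javascript
const loki = require('lokijs');

const db = new loki('products.json', {
  autoload: true,
  autosave: true,
  autosaveInterval: 4000
});

const collection = db.addCollection('REALTIMEFEED', {
  unique: ['id'],             // unique index: enables collection.by('id', value)
  indices: ['type'],
  adaptiveBinaryIndices: true,
  clone: true                 // note: cloning copies documents on read/insert,
                              // which adds per-event cost at 100-200 events/s
});

// later, per Kafka event:
// const doc = collection.by('id', someId); // direct keyed lookup
```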
function update(db, collection, element, id) {
    try {
        // findOne expects a query object, not a bare value
        var data = collection.findOne({ id: id });
        if (!data) {
            console.warn('dbOperations:update:no document with id', id);
            return;
        }
        data.snapshot = Date.now();
        data.delta = element;
        collection.update(data);
        db.saveDatabase();
    } catch (error) {
        console.error('dbOperations:update:failed,', error);
    }
}
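One thing that stands out to me in hindsight: calling `db.saveDatabase()` inside `update` serializes the whole database to disk 100-200 times per second, even though `autosave` is already enabled and persisting every 4 seconds. A hedged sketch of an alternative (`makeDebouncedSave` and the interval are illustrative, not LokiJS API) is to coalesce many save requests into one deferred save:

```javascript
// Sketch: collapse a burst of per-event save requests into a single save.
// `saveFn` stands in for db.saveDatabase.bind(db); `waitMs` is illustrative.
function makeDebouncedSave(saveFn, waitMs) {
  let timer = null;
  return function requestSave() {
    if (timer) return;          // a save is already scheduled for this window
    timer = setTimeout(() => {
      timer = null;
      saveFn();                 // one save covers all updates in the window
    }, waitMs);
  };
}
```

The `update` function would then call `requestSave()` instead of `db.saveDatabase()`, or simply rely on `autosave` alone.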
Could you please suggest what I am missing here?