My Requirement:
Whenever there is a data change in a table (whether insert, update, or delete), I should be able to update my cache using my own logic, which does manipulation using the table(s).
Technology: Node, RethinkDB
My Implementation:
I heard of table.changes() in RethinkDB, which emits a stream of objects representing changes to a table.
I tried this code:
r.table('games').changes().run(conn, function(err, cursor) {
  if (err) throw err;
  // each change object has old_val and new_val fields
  cursor.each(console.log);
});
It's working fine; I'm getting the events, and in the handler I put my logic for the manipulations.
My question is: for how long will it emit the changes? I mean, is there any limit? And how does it work?
I read this in their docs:
The server will buffer up to 100,000 elements. If the buffer limit is hit, early changes will be discarded, and the client will receive an object of the form {error: "Changefeed cache over array size limit, skipped X elements."} where X is the number of elements skipped.
I didn't understand this properly. I guess after 100,000 buffered elements it won't give the changes in the event (the old_val and new_val).
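From the quoted docs, my understanding is that the overflow notice arrives as an element in the same cursor stream, so a consumer could branch on it. Here is a minimal sketch of what I mean; the action names and the dispatch function are my own invention, standing in for real cache logic:

```javascript
// Classify one element from a changefeed cursor.
// Normal elements carry old_val / new_val; on buffer overflow the server
// instead injects an element like { error: "Changefeed cache over array size limit, ..." }.
function classifyChange(change) {
  if (change.error) {
    // Some changes were silently dropped, so the cache can no longer be
    // trusted; signal that a full rebuild from the table is needed.
    return { action: 'rebuild' };
  }
  if (change.old_val === null) {
    return { action: 'insert', row: change.new_val }; // new row appeared
  }
  if (change.new_val === null) {
    return { action: 'delete', row: change.old_val }; // row was removed
  }
  return { action: 'update', row: change.new_val };   // row was modified
}
```

Is branching like this the right way to deal with the limit, or is there a better pattern?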
Please explain this constraint, and also: will this work for my requirement?
I'm very new to this technology. Please help me.