I have several Raspberry Pis running MongoDB, collecting local data. I need this data to sync to Parse.com. I already have the code (in Node.js) that reads from Mongo and writes to Parse.com.
So far I have been running the script every minute via a cron job. For every element that gets synced, the script sets a flag in Mongo: sync2Parse = true.
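For context, the sync script does roughly the following (a simplified sketch, not my actual code; the database/collection names and pushToParse are placeholders):

```js
const { MongoClient } = require('mongodb');

// Placeholder for the part that writes one document to Parse.com (REST call).
async function pushToParse(doc) {
  // ... omitted ...
}

async function syncOnce() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  try {
    const readings = client.db('sensors').collection('readings');
    // Grab everything that has not been pushed to Parse yet.
    const pending = await readings.find({ sync2Parse: { $ne: true } }).toArray();
    for (const doc of pending) {
      await pushToParse(doc);
      // Mark the document so the next run skips it.
      await readings.updateOne({ _id: doc._id }, { $set: { sync2Parse: true } });
    }
  } finally {
    await client.close();
  }
}
```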
The problem is that sometimes the sync takes longer than 1 minute, so the next execution of the sync script starts BEFORE the previous one has finished. Each overlapping run then gets a smaller share of the CPU, which delays the sync job even more... ending in a circle of death.
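The only quick fix I can think of is to guard the cron run with a lock file so that an overlapping run simply exits (a minimal sketch below; the lock path is a placeholder and syncOnce() is the sync logic from the sketch above), but that only skips runs instead of making the sync actually keep up:

```js
const fs = require('fs');

const LOCK_FILE = '/tmp/sync2parse.lock';   // placeholder path

function acquireLock() {
  try {
    // 'wx' fails if the file already exists, so only one run can hold the lock.
    fs.writeFileSync(LOCK_FILE, String(process.pid), { flag: 'wx' });
    return true;
  } catch (err) {
    return false;   // a previous run is still active
  }
}

async function main() {
  if (!acquireLock()) {
    console.log('Previous sync still running, skipping this run.');
    return;
  }
  try {
    await syncOnce();          // the existing sync logic
  } finally {
    fs.unlinkSync(LOCK_FILE);  // release the lock
  }
}

main();
```

It also leaves a stale lock behind if the process crashes, so I don't think this is the right long-term strategy.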
The task is clear: get the data from MongoDB to Parse.com as fast as possible. What would be the best way to go? I don't need the code to do it, just strategic advice on how to proceed.
I tried mongo-watch, but apparently it only works for MongoDB databases with a replica set configured.
Any ideas? Thanks.