
I have several Raspberry Pis running MongoDB, collecting local data. I need this data to sync to parse.com. I already have the code (in Node.js) that reads from Mongo and writes to parse.com.

So far, I have been running the script every minute via a cron job. For every element that gets synced, the script writes a flag in Mongo that says sync2Parse = true.

The problem is that sometimes the sync takes longer than one minute, so the next execution of the sync script starts BEFORE the previous one finishes. Each sync task then gets less CPU, which delays the jobs even further... ending in a circle of death.

The task is clear: get the data from MongoDB to parse.com as fast as possible. What would be the best way to go? I don't need the code to do it, just strategic advice on how to proceed.

I tried mongo-watch, but apparently it only works for MongoDB databases with a replica set configured.

Any ideas? Thanks.

otmezger
    Instead of a cron job, use a timeout. Take the time before the sync starts and compare it to the time after the sync finishes. If it is more than a minute, sync again; if it isn't, set a timeout for the time remaining to a minute. So if the sync took 20 seconds, set a timeout for 40 seconds. – Molda Nov 06 '15 at 15:14
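The comment's suggestion can be sketched roughly like this: time the sync, then wait only for whatever is left of the minute before starting the next run (the helper names are mine, not from the comment, and `runSync` is a stand-in for the poster's mongo-to-parse.com code):

```javascript
// Reschedule relative to how long the sync took: if a sync takes
// 20 s, wait another 40 s; if it takes over a minute, start the
// next run immediately. Sketch of the commenter's idea.
var PERIOD_MS = 60 * 1000;

// Pure helper: how long to wait before the next run.
function remainingDelay(elapsedMs, periodMs) {
  return elapsedMs >= periodMs ? 0 : periodMs - elapsedMs;
}

// Stand-in for the actual sync; calls done() when finished.
function runSync(done) {
  // ... read from mongo, write to parse.com, then:
  done();
}

function loop() {
  var start = Date.now();
  runSync(function () {
    var elapsed = Date.now() - start;
    setTimeout(loop, remainingDelay(elapsed, PERIOD_MS));
  });
}

// loop();  // uncomment to start the cycle
```

Because the next run is only scheduled after the current one finishes, two syncs can never overlap, which directly removes the "circle of death".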

1 Answer


The solution was to wrap all the sync code in a function and then call it like this:

var minutes = 5, the_interval = minutes * 60 * 1000;
setInterval(function() {
  console.log("I am doing my 5 minutes check");
  // do your stuff here
}, the_interval);

Taken from: https://stackoverflow.com/a/8012484/1862909
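Worth noting: `setInterval` fires on a fixed clock, so if a sync ever takes longer than the interval, runs can still overlap. A variant that only schedules the next run after the current one reports completion avoids that (this assumes the sync code exposes a completion callback, which the post doesn't show; `doSync` is a stand-in):

```javascript
// Chained setTimeout: the next run is scheduled only after the
// current sync finishes, so runs can never overlap.
var minutes = 5;
var theInterval = minutes * 60 * 1000;

// Stand-in for the poster's mongo -> parse.com sync code.
function doSync(done) {
  console.log("I am doing my 5 minutes check");
  // ... sync work here, then:
  done();
}

function scheduleNext() {
  doSync(function () {
    setTimeout(scheduleNext, theInterval);
  });
}

// scheduleNext();  // uncomment to start the loop
```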

otmezger