
I have an events table with multiple records.

Columns: id, start_time, end_time, ...

I have to fetch analytics for all the live events (there can be thousands at any given time) repeatedly via third-party API calls, which accept only one event at a time. I have to keep doing this for each live event until that event ends. Say the minimum interval between analytics fetches for an event is 15 minutes.

Third-party API calls need to be sequential.

I am open to using any tool, e.g. Redis.

What are efficient ways to do this?

I need something like an LRU system with repetition, but I don't know exactly how to implement it.

Ken White
Rahul Patel
1 Answer


One efficient way to achieve this is to use an asynchronous control-flow helper such as the async library's mapSeries function in combination with the Redis SET command.

Here is an example of how you could use mapSeries and Redis to make API requests for all the matching IDs in the table:

const async = require("async");
const redis = require("redis");

const client = redis.createClient();
// Function to get the ids that match the start_time and end_time filter
const IDS = getIdsFromTable();

async.mapSeries(IDS, (ID, callback) => {
  // Make the third-party API request with ID
  // ...
  client.set(ID, JSON.stringify(result), function (err, reply) {
    if (err) {
      return callback(err);
    }
    // Once the API request is complete and the result is saved,
    // call the callback to move on to the next ID
    callback(null, result);
  });
}, (err, results) => {
  // All API requests have completed and their results are saved in Redis;
  // `results` is an array of all the responses
});

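The snippet above runs a single pass over the IDs. To repeat the cycle every 15 minutes until each event ends, one approach is to re-query the live IDs before every pass so ended events drop out automatically. Here is a minimal sketch in plain async/await (getIds and fetchOne are hypothetical stand-ins for your database query and your third-party API call plus Redis write):

```javascript
// Sequentially fetch analytics for each currently-live event id (one cycle).
// getIds: async () => array of live event ids (hypothetical DB query)
// fetchOne: async (id) => analytics result (hypothetical API call + cache write)
async function runCycle(getIds, fetchOne) {
  const ids = await getIds(); // re-query so ended events drop out
  const results = [];
  for (const id of ids) {
    results.push(await fetchOne(id)); // sequential: one API call at a time
  }
  return results;
}

// Repeat the cycle roughly every 15 minutes until the process is stopped.
function startPolling(getIds, fetchOne, intervalMs = 15 * 60 * 1000) {
  runCycle(getIds, fetchOne)
    .catch(console.error)
    .finally(() =>
      setTimeout(() => startPolling(getIds, fetchOne, intervalMs), intervalMs)
    );
}
```

Re-querying the ID list each cycle is what gives the "repetition until the event ends" behavior without any extra bookkeeping.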
You can set a timeout for each request so that if any request takes longer than expected it does not block the rest of the sequence, and handle any error that occurs. It is also important to consider data expiration in Redis: if you don't need the data after a certain time, set a time-to-live (TTL) on the keys.
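The per-request timeout mentioned above can be implemented with Promise.race; here is a minimal sketch (withTimeout is a hypothetical helper, not part of any library):

```javascript
// Wrap a request promise so it rejects after `ms` milliseconds, preventing
// one slow third-party call from blocking the whole sequence.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error("request timed out")), ms);
  });
  // Whichever settles first wins; always clear the timer afterwards.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

Note that the timed-out underlying request is not cancelled by this wrapper; it merely stops the sequence from waiting on it.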

Nehad Awad