
I have a real-time, data-intensive application that uses a local in-app/in-memory cache.

40,000 vehicles send data to one server (every 5 seconds), and I have to work out the distance travelled between the previous and current location.

To do this I cache each vehicle's previous lat/lon; when a new piece of data arrives, I take the new lat/lon, work out the distance travelled between the two points (e.g. 5 feet) and add it to the vehicle's accumulating odometer (e.g. 60,000 miles).
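
A minimal sketch of what this single-server step might look like, assuming Python, plain dicts as the in-memory cache, and a haversine helper (the function and field names are illustrative, not from the original post):

```python
import math

# In-memory cache: vehicle_id -> (lat, lon), plus an accumulated odometer in miles.
last_position = {}
odometer_miles = {}

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # Earth radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_reading(vehicle_id, lat, lon):
    """Handle one 5-second reading: accumulate distance since the previous point."""
    prev = last_position.get(vehicle_id)
    if prev is not None:
        odometer_miles[vehicle_id] = odometer_miles.get(vehicle_id, 0.0) + \
            haversine_miles(prev[0], prev[1], lat, lon)
    last_position[vehicle_id] = (lat, lon)
```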

I need to start load balancing this to handle scale. A local cache would obviously be out of date once requests hit two different servers; however, a distributed cache seems like it would massively slow down processing due to the network hop to a shared cache, especially at the volumes and frequency mentioned above.

One solution could be to use sticky sessions so that car A always goes to server A, and to periodically persist the in-memory cache in case a server goes down.
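
One way to get that stickiness without keeping session state on the load balancer is to hash the vehicle ID onto a server, so the same car always lands on the same box; a rough sketch, where the server pool and vehicle ID are placeholders:

```python
import hashlib

SERVERS = ["server-a", "server-b", "server-c"]  # hypothetical pool

def route(vehicle_id: str) -> str:
    """Deterministically map a vehicle to one server so its cache stays local."""
    digest = hashlib.md5(vehicle_id.encode()).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]
```

Note that plain modulo hashing reshuffles most vehicles when the pool size changes; consistent hashing reduces that churn, which matters if you rely on a periodic backup to repopulate caches after a topology change.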

However, I'm sure this problem has been solved before. Are there industry caching patterns to use in this scenario?

Quade

1 Answer


I am wondering how this went for you. I would have started with the sticky-session, in-memory cache option, given the nature of the load. It appears that one vehicle can be assigned to a single server, and a local cache can track its previous lat/lng. The only thing is that once a car stops sending data, you need to be able to recognize that and release the server for the next car. Anyway, curious to know how it worked out. Interesting problem.
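
For the "release once a car stops sending" part, a common approach is a TTL sweep over the in-memory cache; a sketch assuming the dict-style cache from the question (the 10-minute threshold is an arbitrary choice):

```python
import time

INACTIVITY_TTL_SECONDS = 600  # assumption: treat a vehicle as gone after 10 minutes of silence

last_seen = {}      # vehicle_id -> timestamp of last update
last_position = {}  # vehicle_id -> (lat, lon)

def touch(vehicle_id, lat, lon):
    """Record a fresh reading for a vehicle."""
    last_seen[vehicle_id] = time.time()
    last_position[vehicle_id] = (lat, lon)

def evict_stale():
    """Periodically drop vehicles that have stopped reporting, freeing memory for new ones."""
    cutoff = time.time() - INACTIVITY_TTL_SECONDS
    for vehicle_id in [v for v, t in last_seen.items() if t < cutoff]:
        last_seen.pop(vehicle_id, None)
        last_position.pop(vehicle_id, None)
```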

meow
  • hey, yes, in the end I went with sticky sessions to push each car to the same server, and this works fine. If something fails over we might have slightly older data on the new server, but the system still runs; when I see a new device on a new server it will first get the latest info and then continue processing – Quade Sep 02 '21 at 04:06
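
A sketch of the failover behaviour described in that comment, assuming the latest state is periodically written to some shared store; the plain dicts standing in for the local cache and shared store are placeholders for whatever is actually used:

```python
def get_previous_state(vehicle_id, local_cache, shared_store):
    """Read-through: on a local miss (e.g. after failover to a new server), pull the
    latest known state from the shared store before continuing normal processing."""
    state = local_cache.get(vehicle_id)
    if state is None:
        state = shared_store.get(vehicle_id)  # may be slightly stale; accepted as a trade-off
        if state is not None:
            local_cache[vehicle_id] = state
    return state

# Example with dicts standing in for the local cache and the shared store:
shared = {"car-123": {"lat": 51.5, "lon": -0.12, "odometer_miles": 60000.0}}
local = {}
print(get_previous_state("car-123", local, shared))
```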