
Is there a mechanism for working with data under high concurrency?

At first we used MongoDB, and its atomic updates solved the problem. But the update frequency grew to about 1000 per second, so we set up Redis to help MongoDB and wrote synchronization between them. It works well, but we now have a concurrency problem with Redis.

For example:

  1. The first request arrives at 0.01 ms and its process exits at 0.04 ms.
  2. The second request arrives at 0.02 ms and exits at 0.03 ms.

Both requests get the same object, change its data, and save it on exit.

When we used MongoDB, we could do partial updates on the object, but with Redis we cannot.
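(For context, a MongoDB partial update looks roughly like this; a minimal sketch using pymongo, where the database, collection, and field names are hypothetical:)

```python
from pymongo import MongoClient

# Hypothetical database/collection/field names, just to illustrate a partial update.
client = MongoClient("mongodb://localhost:27017")
players = client["game"]["players"]

# $set touches only the named field; concurrent updates to other fields
# of the same document are not overwritten.
players.update_one({"_id": 42}, {"$set": {"score": 11}})
```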

Is it possible to manipulate the same object (data) from multiple processes at the same time without overwriting the whole object, only part of it?

The only way I have found is to create a lock mechanism and make a process wait while the lock exists before reading the object a second time.

vuliad

1 Answer


Redis doesn't yet have the partial-update mechanism you are asking for, but as an alternative you can write Lua scripts to avoid the concurrency issue.

In the script, you read the value first, manipulate it however you want, and finally store it again. Redis guarantees that the script is executed atomically, so you don't need any lock mechanism in this scenario.
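A minimal sketch of that idea, assuming the shared object is stored as a JSON string under a single key and the script is invoked through redis-py's `register_script`; the key `player:42` and the field names are made up for illustration:

```python
import json
import redis

# Lua script executed atomically by Redis: read the JSON object,
# change one field, write the whole object back.
UPDATE_FIELD_LUA = """
local raw = redis.call('GET', KEYS[1])
local obj = raw and cjson.decode(raw) or {}
obj[ARGV[1]] = ARGV[2]
redis.call('SET', KEYS[1], cjson.encode(obj))
return redis.status_reply('OK')
"""

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
update_field = r.register_script(UPDATE_FIELD_LUA)

# Seed an example object (hypothetical key and fields).
r.set("player:42", json.dumps({"name": "alice", "score": "10"}))

# Two concurrent callers can each run this; because the whole script runs
# atomically, an update to one field never overwrites a concurrent update
# to another field of the same object.
update_field(keys=["player:42"], args=["score", "11"])
update_field(keys=["player:42"], args=["name", "bob"])

print(r.get("player:42"))  # e.g. {"name":"bob","score":"11"}
```

The trade-off is that the read-modify-write now happens inside Redis, so the manipulation logic has to be expressible in Lua rather than in your application code.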

ALittleDiff