Maybe this is a question for Code Review, but I think it's best suited here. Let me know if it needs to be moved.
I'm using Redis as what amounts to a long-term cache for some data in a Sinatra application. I'm iterating over all of the keys in Redis, pulling their data, and parsing it with JSON. It's taking quite a bit of time. Here's the benchmark:
[13] pry(main)> Benchmark.measure do
[13] pry(main)* dkeys = redis.keys.delete_if {|e| e == "last_run"}
[13] pry(main)* @delayed = dkeys.map {|k| {k => JSON.parse(redis.get(k))}}
[13] pry(main)* end
=> 0.520000 0.160000 0.680000 (132.410716)
[14] pry(main)> @delayed.count
=> 1358
[15] pry(main)>
The delay is clearly in the map, and I think the latency comes from making 1,300+ separate calls to Redis.
Is there a way to pull all of the data from Redis into one object in a single call, so I don't have to hit Redis on every step of the iteration?
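For example, would something like this work? This is just an untested sketch of what I have in mind, assuming the Ruby redis gem's mget can fetch all of the values in one round trip:

require "redis"
require "json"

redis = Redis.new

# Same key filtering as above, but fetch every value with a single MGET
# instead of one GET per key, then do the JSON parsing locally.
dkeys = redis.keys.delete_if { |k| k == "last_run" }
values = redis.mget(*dkeys)
@delayed = dkeys.zip(values).map { |k, v| { k => JSON.parse(v) } }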