
Maybe this is a question for code review, but I think it's best suited here. Let me know if it needs to be moved.

I'm using Redis as what amounts to a long-term cache for some data in a Sinatra application. I'm iterating over all of the keys in Redis, pulling each value, and parsing it with JSON. It's taking quite a bit of time. Here's the benchmark:

[13] pry(main)> Benchmark.measure do
[13] pry(main)*   dkeys = redis.keys.delete_if {|e| e == "last_run"}  
[13] pry(main)*   @delayed = dkeys.map {|k| {k => JSON.parse(redis.get(k))}}                                                                                                         
[13] pry(main)* end  
=>   0.520000   0.160000   0.680000 (132.410716)

[14] pry(main)> @delayed.count
=> 1358
[15] pry(main)> 

The delay is clearly in the map, and I suspect the latency comes from making 1,300+ round trips to Redis.

Is there a way that I can pull all of the data from redis into an object so I won't have to call it on each step of the iteration?


1 Answer


You should take a look at the answer @peterpan refers to. In particular, mget should do the trick:

dkeys = redis.keys.delete_if {|e| e == "last_run"}
@delayed = redis.mget(dkeys).map {|s| JSON.parse(s)}
Linus Thiel
  • This one runs substantially faster (finishes in just over 1 second) and gets me 90% there. @peterpan's comment was incredibly helpful as well. – cmhobbs Dec 03 '12 at 15:09
  • Good to hear -- I see now that you actually wanted a map with key => (JSON parsed) value. I'm not so well versed in Ruby, sorry about that but glad that the answer was helpful! – Linus Thiel Dec 03 '12 at 16:01
  • Yeah, I'm using Array#zip to kind of mangle a Hash together right now with some success. At any rate, the answer worked quite well. – cmhobbs Dec 03 '12 at 20:26
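The zip approach mentioned in the comments can be sketched as follows. The keys and JSON strings here are hypothetical stand-ins; in the real app they would come from `redis.keys` and `redis.mget(dkeys)` as in the answer above:

```ruby
require "json"

# Stand-ins for redis.keys (minus "last_run") and the strings
# returned by redis.mget(dkeys) -- hypothetical sample data.
dkeys  = ["job:1", "job:2"]
values = ['{"status":"queued"}', '{"status":"done"}']

# Pair each key with its parsed value, then collapse the pairs
# into a single Hash, preserving the key => value association
# that the plain mget-then-map version loses.
@delayed = dkeys.zip(values.map { |s| JSON.parse(s) }).to_h
# @delayed => {"job:1"=>{"status"=>"queued"}, "job:2"=>{"status"=>"done"}}
```

This keeps the batching benefit of a single `mget` call while producing the same key-to-parsed-value mapping the original per-key loop built.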