I already have one Redis node, but its memory usage has grown and I want to migrate the data to multiple instances. If I load the RDB into every instance, a lot of memory is wasted. I could use KEYS to get all the keys and then shard the data by performing a consistent hash on each key, but that is very ugly... Is there a more elegant solution?
If you had Redis running with AOF configured, you could simply write a script that reads the AOF. That's what we did. With Redis in RDB configuration, I don't think you can do that.
If you're running Redis 2.6 you could use the MIGRATE command:
http://redis.io/commands/migrate
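For reference, here is a minimal sketch of what a single-key MIGRATE could look like from a client. The destination host, the key name, and the use of redis-py are my assumptions, not something from the original answer:

```python
# Minimal sketch: move a single key to another instance with MIGRATE
# (requires Redis >= 2.6). Host, port, and key name are hypothetical.
import redis

src = redis.Redis(host="localhost", port=6379, db=0)

# MIGRATE host port key destination-db timeout
# The key is atomically transferred to the target and removed from the source.
src.execute_command("MIGRATE", "10.0.0.2", 6379, "some:key", 0, 1000)
```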
You would still need to iterate over the keys with KEYS. I don't think there is a better solution than iterating over all keys with KEYS and then consistently hashing them. If you're not afraid of communicating with Redis directly, you could process the bulk reply of KEYS * while reading it, to speed things up and save memory.
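One way the KEYS-then-shard approach could look as a script (a rough sketch: redis-py, the target instance addresses, and the plain modulo hash used in place of a full consistent-hash ring are all assumptions):

```python
# Sketch of iterating all keys on the source and migrating each one
# to a shard chosen by hashing the key. Targets are hypothetical.
import hashlib
import redis

src = redis.Redis(host="localhost", port=6379, db=0)
targets = [("10.0.0.2", 6379), ("10.0.0.3", 6379)]  # hypothetical shard addresses

def shard_for(key: bytes):
    # Map a key to a target instance by hashing it.
    # A real consistent-hash ring would minimize remapping when shards change;
    # plain modulo is used here only to keep the sketch short.
    h = int(hashlib.md5(key).hexdigest(), 16)
    return targets[h % len(targets)]

for key in src.keys("*"):  # KEYS blocks the server on large datasets
    host, port = shard_for(key)
    # MIGRATE copies the key to the target and deletes it from the source.
    src.execute_command("MIGRATE", host, port, key, 0, 1000)
```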

Jonas Adler
I've tried both AOF and RDB; both work well. I finally chose AOF because it can be done incrementally. – zhouzuan2k Dec 03 '12 at 14:49