
I have to insert millions of rows from a MySQL database into Redis in Python. I am currently using a pipeline for this, but it is taking too much time and memory. Can you suggest a better way to implement this?

keshaw
  • Could you paste some of your code? How are you getting your rows from the mysql database? – Arnaud Potier Mar 10 '16 at 11:29
  • Using `SELECT * FROM table_name`, and to load into Redis I am using this code – keshaw Mar 10 '16 at 11:37:

        import redis

        def load(pdtDict):
            redIs = redis.Redis()
            pipe = redIs.pipeline()
            for key in pdtDict.keys():
                pipe.hmset(self.seller + ":" + str(key), pdtDict[key])
            pipe.execute()
  • Some links that might help you; See here: (https://groups.google.com/forum/?fromgroups=#!searchin/redis-db/pipeline/redis-db/D4V6kDJNDsI/1-vqnsCxJ18J) and here: (http://stackoverflow.com/a/7505508/2759336) – Tw Bert Mar 10 '16 at 12:18
  • Can you explain how to insert in batches? – keshaw Mar 10 '16 at 12:23
  • Edit your question with sample source data, you may get better help with that. – woozyking Mar 28 '16 at 22:43
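Following the batching suggestion in the comments, here is one possible sketch: stream rows out of MySQL (e.g. with a server-side cursor such as MySQLdb's `SSCursor`, so the full result set is never held in memory) and flush them to Redis in fixed-size pipeline batches rather than one giant pipeline. The helper names (`chunked`, `load_in_batches`), the `batch_size` value, and the `(key, fields)` row shape are illustrative, not from the question; the sketch also assumes redis-py ≥ 3.5, where `hset(name, mapping=...)` replaces the deprecated `hmset` used in the original code (which also references `self.seller` outside a class and would raise a `NameError` as written).

```python
import itertools


def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            return
        yield batch


def load_in_batches(redis_client, seller, rows, batch_size=10000):
    """Write rows to Redis in fixed-size pipeline batches.

    `rows` is an iterable of (key, field_dict) pairs, ideally streamed
    from MySQL with a server-side cursor.  Each batch costs one network
    round trip, and memory stays bounded by `batch_size` instead of the
    total row count.
    """
    total = 0
    for batch in chunked(rows, batch_size):
        # transaction=False avoids MULTI/EXEC overhead for bulk loading
        pipe = redis_client.pipeline(transaction=False)
        for key, fields in batch:
            pipe.hset(seller + ":" + str(key), mapping=fields)
        pipe.execute()
        total += len(batch)
    return total
```

A `batch_size` in the low thousands to tens of thousands is a common starting point; tuning it trades round trips against per-pipeline memory on both client and server.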

0 Answers