
I have a huge list that I'm inserting into Redis, but it fails.

one_million = [18081681, 18081686, ....]  # list of 1 million integers, 8-9 digits long
ten_million = [18081681, 18081686, ....]  # list of 10 million integers, 8-9 digits long
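
The exact values don't matter - for a reproducible test, something like this generates equivalent data:

import random

# Random 8-9 digit integers, same shape as the real data
one_million = [random.randint(10_000_000, 999_999_999) for _ in range(1_000_000)]
ten_million = [random.randint(10_000_000, 999_999_999) for _ in range(10_000_000)]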

While inserting the one_million list works

redis.lpush('random', *one_million)  # works

inserting the ten_million list doesn't

redis.lpush('random', *ten_million)

The above line gives this error

ConnectionError: Error 104 while writing to socket. Connection reset by peer.

But I can still insert 10 million items in chunks

for _ in range(10):
    redis.lpush('random', *one_million)

This works, and Redis now has 10 million integers, taking up about 50 MB

<elasticcache>:6379> memory usage random
(integer) 50048515
<elasticcache>:6379> llen random
(integer) 10000000
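
The chunked workaround generalizes to something like this (the chunk size is arbitrary, just small enough that each individual push succeeds):

def lpush_chunked(client, key, values, chunk_size=1_000_000):
    # Push fixed-size slices so no single command gets too large
    for i in range(0, len(values), chunk_size):
        client.lpush(key, *values[i:i + chunk_size])

lpush_chunked(redis, 'random', ten_million)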

I'm using AWS ElastiCache, engine version 5.0.6. Strangely enough, if I do the same thing - inserting 10 million in a single go - against the Redis setup on my local macOS machine, it works. Apparently something differs between the ElastiCache configuration and my local Redis defaults, maybe something related to network limits or timeouts, but I can't figure out what.
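
For rough context, each 8-9 digit integer goes over the wire as a RESP bulk string of about 15 bytes ("$9\r\n" + digits + "\r\n"), so by my back-of-the-envelope estimate the single 10-million-element LPUSH is around 150 MB in one command:

# Rough size of the single LPUSH payload in RESP encoding
args = 10_000_000
bytes_per_arg = 15  # "$9\r\n" + 9 digits + "\r\n"
print(f"~{args * bytes_per_arg / (1024 * 1024):.0f} MB")  # prints ~143 MB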

I know about the 512 MB limit on bulk strings in Redis, controlled by the proto-max-bulk-len parameter, and also about list-max-listpack-size, which is about size limits on each individual list element. But neither seems to explain my case.
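
On a local Redis those values can at least be inspected directly (as far as I can tell, the CONFIG command is restricted on ElastiCache, so there they come from the parameter group):

# Works against my local Redis; CONFIG is restricted on ElastiCache
print(redis.config_get('proto-max-bulk-len'))
print(redis.config_get('list-max-*'))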

Would greatly appreciate help in understanding this.
