
When the database grows larger than 5120000 bytes, Redis fails to save it and crashes.

I run it on Ubuntu Server 14.04 with Redis v3.0.1.

I updated `/etc/sysctl.conf`:

```
vm.overcommit_memory = 1
net.core.somaxconn = 1024
```

and `/etc/rc.local`:

```
echo never > /sys/kernel/mm/transparent_hugepage/enabled
```

as recommended in the Redis log.
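For reference, the `sysctl.conf` entries only take effect after a reboot or an explicit reload; one way to apply and then verify them immediately (standard Linux tooling, not something from the Redis log itself) is:

```shell
# Reload /etc/sysctl.conf without rebooting
sudo sysctl -p

# Verify the values the Redis log warned about
cat /proc/sys/vm/overcommit_memory                  # expect: 1
cat /proc/sys/net/core/somaxconn                    # expect: 1024
cat /sys/kernel/mm/transparent_hugepage/enabled     # expect: always madvise [never]
```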

I insert test data, around 200 MB in total spread over 80000 keys, then execute BGSAVE from the command line.

The message `Background saving started` is returned, and the log shows:

```
* Background saving started by pid 6535
# Background saving terminated by signal 25
```

Then I execute SAVE from the command line; it creates

```
5120000 Dec 22 17:24 temp-4975.rdb
```

and the server crashes:

```
Could not connect to Redis at 127.0.0.1:6379: Connection refused
```

When dumping a smaller amount of data, everything works fine.

Is there a limit of 5120000 bytes per file? What should I do to avoid this?
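For what it's worth, on Linux signal 25 is SIGXFSZ, which the kernel sends when a process writes past its file-size resource limit (RLIMIT_FSIZE), and `ulimit -f` reports that limit in 512-byte blocks. A quick sanity check (assuming Redis inherited its limits from the shell or init script that started it):

```shell
# On Linux, signal 25 is SIGXFSZ: the process exceeded its file-size rlimit.
# ulimit -f reports that limit in 512-byte blocks.
ulimit -f                     # "unlimited", or a number of blocks

# 5120000 bytes is exactly 10000 blocks, so a limit of 10000 blocks
# would truncate the RDB file at exactly this size:
echo $((5120000 / 512))       # prints: 10000
```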

user3376996
  • See last response on this thread: https://groups.google.com/forum/#!topic/redis-db/bNZNp8xiR8A It says you should increase your `maxmemory` settings. – Niloct Dec 22 '16 at 15:57
  • `maxmemory` is set to `1536mb` – user3376996 Dec 23 '16 at 07:43
  • Did you reboot your server after changing the configuration? You have only described changes in config files, or files used at start time. – Didier Spezia Dec 26 '16 at 09:48
  • @DidierSpezia all changes are applied; after the server boots I get: `cat /sys/kernel/mm/transparent_hugepage/enabled`: `always madvise [never]`, `cat /proc/sys/vm/overcommit_memory`: `1`, `/proc/sys/net/core/somaxconn`: `1024` – user3376996 Dec 28 '16 at 10:21

0 Answers