
A Redis server (v2.8.4) is running on an Ubuntu 14.04 VPS with 8 GB RAM and 16 GB of swap (on SSD). However, htop shows that redis-server alone is taking up 22.4 GB of memory!

I do not think the redis database is this large, so why is it taking up so much memory?

[htop screenshot: redis-server memory usage before restart]

Redis version: Redis server v=2.8.4 sha=00000000:0 malloc=jemalloc-3.4.1 bits=64 build=a44a05d76f06a5d9

After restarting redis-server

[htop screenshot: redis-server memory usage after restart]


Update

redis-server eventually crashed from running out of memory. Mem and Swp both hit 100%, and then redis-server was killed along with other services.

From dmesg:

[165578.047682] Out of memory: Kill process 10155 (redis-server) score 834 or sacrifice child
[165578.047896] Killed process 10155 (redis-server) total-vm:31038376kB, anon-rss:5636092kB, file-rss:0kB

It looks like we really do need to worry about Redis's memory usage growing over time. How can we troubleshoot this?
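To start digging, the next check I plan to run (assuming redis-cli can reach the default instance on localhost:6379) is to compare Redis's own memory accounting against what the OS reports:

redis-cli INFO memory
# fields worth tracking over time:
#   used_memory             - bytes Redis has allocated for its data set
#   used_memory_rss         - resident memory as seen by the OS
#   mem_fragmentation_ratio - used_memory_rss / used_memory; a high ratio suggests
#                             fragmentation or allocator overhead rather than data growth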

Nyxynyx
  • That is not what the VIRT column means. You need to look at the RES column. – Michael Hampton Jul 27 '14 at 00:12
  • @MichaelHampton Thanks, I read up on `RES` and `VIRT`. Should I be worried that `VIRT` is 22 GB, which seems extremely large? What I've read says that `VIRT` is irrelevant and we should only look at `RES`. But `RES` was 5 GB before the restart and shrank to 68 MB after restarting; is this normal? (A quick way to watch both columns for the redis-server process is sketched after these comments.) – Nyxynyx Jul 27 '14 at 01:21
  • @MichaelHampton `redis-server` promptly crashes when it keeps hogging more and more memory/swap... – Nyxynyx Jul 27 '14 at 20:04
  • Yes, that's what it does when you store too much data in it. Either store less data or give the server more memory. – Michael Hampton Jul 27 '14 at 20:11
  • @MichaelHampton I don't think the data set gets that large; it's probably <1GB. Restarting redis brings its memory usage back down to 68MB. If redis is not restarted, it just keeps using more memory. – Nyxynyx Jul 27 '14 at 20:44
  • I'm having the same problem: a dataset of 100MB, and Redis grows to 3-4GB and is eventually killed by the OOM killer. My problem is that even when I restart it, it climbs back up to that amount of RAM very fast... – Manuel Meurer Nov 13 '14 at 17:02
  • @ManuelMeurer what does your Redis `INFO` command show? – Itamar Haber Nov 13 '14 at 19:14
  • @ItamarHaber https://gist.github.com/manuelmeurer/8c660be28534f8332a23 Any hints in there what might be wrong? – Manuel Meurer Nov 14 '14 at 08:46
  • @ItamarHaber Added it as a question here: http://serverfault.com/questions/644309/redis-eats-up-more-and-more-memory – Manuel Meurer Nov 14 '14 at 09:16
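Sketch referenced in the comments above: assuming a standard Linux procps, htop's VIRT and RES columns correspond to ps's VSZ and RSS, so both can be sampled for the redis-server process like this:

ps -C redis-server -o pid,vsz,rss,comm
# vsz = virtual size in KiB (htop's VIRT), rss = resident set size in KiB (htop's RES);
# re-running this periodically shows whether resident memory, not just virtual, keeps growing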

1 Answer


You should check your redis.conf for a setting called `maxmemory`. If you don't want Redis to use more than 100 MB of memory, set the following in your redis.conf (104857600 bytes = 100 MB):

maxmemory 104857600

After applying the change you will have to restart your Redis instance.
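If you would rather not restart immediately, the same limit can (as far as I recall for 2.8) also be applied to the running instance with CONFIG SET, and CONFIG REWRITE will persist it back into redis.conf:

redis-cli CONFIG SET maxmemory 104857600   # apply the 100 MB limit to the live instance
redis-cli CONFIG REWRITE                   # write the running config back to redis.conf (2.8+)
redis-cli CONFIG GET maxmemory             # verify the limit took effect

Also note that what happens once the limit is reached is controlled by `maxmemory-policy`: with `noeviction`, Redis returns errors on writes instead of silently evicting keys, which is the safer choice if you are not using it purely as a cache.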

Jakov Sosic
  • If I understand correctly, this will turn Redis into a cache, evicting older keys to fit into the memory defined by `maxmemory`. This will cause me to lose data... – Nyxynyx Jul 27 '14 at 14:03
  • Yes, it will. See here: http://redis.io/topics/config (section "Configuring Redis as a cache") – Karl Wilbur Oct 15 '16 at 01:00