
I did some evaluations on CouchDB recently. I found that memory consumption is quite high for view construction (map & reduce) as well as for importing larger JSON documents into CouchDB. I evaluated the view construction function on an Ubuntu system (4 cores, Intel® Xeon® CPU E3-1240 v5 @ 3.50GHz). Here are the results:

  1. four hundred 100 KB datasets used around 683 MB of memory;
  2. one 80 MB dataset used around 2.5 GB of memory;
  3. four 80 MB datasets used around 10 GB of memory.

It seems that memory consumption is hundreds of times the size of the original JSON dataset. With a 1 GB dataset, CouchDB would run out of memory. Does anyone know why memory consumption is so high? Many thanks!
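
To make the setup concrete, here is a minimal sketch of the kind of map & reduce view construction I am talking about; the database name, document fields, and functions below are simplified placeholders, not the exact ones I used:

    # Sketch only: assumes an existing database named "bench" whose documents
    # carry a numeric "value" field; adjust names to your own data.
    import requests

    COUCH = "http://admin:password@localhost:5984"
    DB = "bench"

    design_doc = {
        "_id": "_design/stats",
        "views": {
            "by_value": {
                # Map emits one row per document; the built-in _sum reduce
                # adds up the emitted values.
                "map": "function (doc) { if (doc.value) { emit(doc._id, doc.value); } }",
                "reduce": "_sum",
            }
        },
    }

    # Upload the design document; CouchDB builds the view index when the view
    # is first queried, which is where the high memory usage shows up.
    requests.put(f"{COUCH}/{DB}/_design/stats", json=design_doc).raise_for_status()
    print(requests.get(f"{COUCH}/{DB}/_design/stats/_view/by_value").json())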

Jack

2 Answers


I don't know why the memory usage is so high, but I know it's consistent with CouchDB, and you can't really get around it as long as you have large document sizes. I eventually split out the data that I wanted to build views on and kept the full documents in a separate database for later extraction.
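
A rough sketch of that kind of split, with made-up database and field names (only the small fields the views need go into the indexed database; the full payload goes into an archive database):

    # Illustrative only: "docs_view" and "docs_full" are hypothetical database
    # names, and "title"/"created_at" stand in for whatever small fields your
    # views actually need.
    import requests

    COUCH = "http://admin:password@localhost:5984"

    def store_document(doc_id, full_doc):
        # Full document goes into a database that no views are built on.
        requests.put(f"{COUCH}/docs_full/{doc_id}", json=full_doc).raise_for_status()

        # Only the fields the views need go into the indexed database, so
        # view construction never touches the large payload.
        slim_doc = {
            "full_doc_id": doc_id,  # pointer back to the full document
            "title": full_doc.get("title"),
            "created_at": full_doc.get("created_at"),
        }
        requests.put(f"{COUCH}/docs_view/{doc_id}", json=slim_doc).raise_for_status()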

MitchB

I know it's late to answer, but I'll leave this here in case it benefits someone. It's about response caching: CouchDB caches responses so it can return results faster. You can handle the issue by setting caching limits.

Check the configuration documentation: https://docs.couchdb.org/en/latest/config/couchdb.html
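
I can't point to the exact option name, so the section and key below are placeholders you would have to look up on the linked page; this only shows the general mechanism CouchDB (2.x and later) offers for changing a configuration value at runtime:

    # Placeholder section/key: substitute the real option from the docs.
    import json
    import requests

    COUCH = "http://admin:password@localhost:5984"
    section = "couchdb"             # placeholder section
    key = "some_cache_setting"      # placeholder key, not a real option name
    value = "268435456"             # config values are sent as JSON strings

    resp = requests.put(
        f"{COUCH}/_node/_local/_config/{section}/{key}",
        data=json.dumps(value),
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    print("previous value:", resp.json())

The same value can also be set in local.ini under the corresponding section, followed by a node restart.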

Emir Cangır
    Can you please add an example of how to set the cache limits? The link does not explain anything and does not even reference something like a CouchDB cache. – Tino Jun 23 '23 at 19:56