I am running a couple of spiders in parallel via scrapyd 1.2. Each crawl significantly increases the Buffers value, as seen in the chart. What is this value, and how can I reduce the footprint?
Linux uses otherwise-idle memory for various caches, mostly file related. Use the slabtop command to see details of kernel cache usage.
how can I reduce the footprint?
You don't need to. These caches will be evicted quickly and automatically if the memory is needed for something else.
Further, it is not yet a concern: 1 GB and change free on a 4 GB system is a sizable chunk of unused RAM.
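As a quick check, standard Linux tooling shows how much of that memory is reclaimable buffers and cache rather than application memory; a minimal sketch (the exact `free` column layout depends on your procps version):

```shell
# Overall memory breakdown; the "buff/cache" column is reclaimable file cache
free -h

# The same counters, straight from the kernel
grep -E '^(MemTotal|MemFree|MemAvailable|Buffers|Cached):' /proc/meminfo

# Per-slab kernel cache details (typically needs root); -o prints once and exits:
#   slabtop -o
```

MemAvailable is the figure to watch: it estimates how much memory is usable for new workloads after reclaiming caches, and it will be much larger than MemFree on a system with big buffer/cache numbers.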

John Mahowald
- Well, I upgraded the V-Box from 2 GB to 4 GB. Looking at the footprint, I am asking myself whether this was really needed. My question is: will it reduce the performance of the machine if I go back to 2 GB? The blue bar is unclear to me. – merlin May 26 '20 at 22:35
- 2 GB total, with a load similar to the 4 GB graph, would start reclaiming from cache as free memory decreases. Only testing can tell if the performance is adequate for your requirements. I predict it will be acceptable. – John Mahowald May 26 '20 at 23:24