
I ran a large number of calculations in R (using RStudio), and I'm confused about the state this has left R in. Specifically, after I ran

rm(list = ls(all.names = TRUE))

and then

mem_used()
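(For what it's worth, mem_used() is not base R; it presumably comes from the pryr package here. Spelled out, the sequence is:)

library(pryr)                    # provides mem_used()
rm(list = ls(all.names = TRUE))  # remove every user object, including dot-prefixed ones
invisible(gc())                  # force a collection so freed memory is actually reclaimed
mem_used()                       # total bytes R reports as allocated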

I see that around 400 megabytes are still in use. Inspecting more closely with

gc(verbose = T)

I see the following:

Garbage collection 27693 = 26029+1296+368 (level 2) ...
95.5 Mbytes of cons cells used (31%)
331.6 Mbytes of vectors used (33%)
           used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  1787624  95.5    5684620  303.6   5684620  303.6
Vcells 43457011 331.6  132000253 1007.1 262740244 2004.6
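(As an aside, my understanding is that the "max used" columns are high-water marks since the last reset, which can be cleared with:)

gc(reset = TRUE)  # zero the "max used" statistics for future monitoring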

My question is: how do I figure out what is using all this space? The 331.6 Mb of vector storage in particular confuses me, given that I just rm'd all the user-allocated objects. Is it reasonable for R's internal structures to consume this much space?
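In case it clarifies what I'm after, below is a rough sketch of the kind of accounting I'd like to do; the namespace loop is only a guess at where the leftover vectors might live:

# The global workspace really is empty at this point:
ls(all.names = TRUE)            # character(0)

# Rough accounting: total size of the bindings in each loaded namespace.
# (object.size() double-counts shared data, so treat this as approximate.)
ns_mb <- vapply(loadedNamespaces(), function(ns) {
  env  <- asNamespace(ns)
  objs <- ls(env, all.names = TRUE)
  sum(vapply(objs, function(o) {
    as.numeric(object.size(get(o, envir = env)))
  }, numeric(1))) / 1024^2
}, numeric(1))
sort(ns_mb, decreasing = TRUE)  # megabytes per namespace, largest first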

Thanks for any help that anyone can provide.

chuck taylor
  • The memory usage is one of the most annoying issues of RStudio. Clearing the workspace doesn't help; restarting RStudio does. – RHA Mar 23 '16 at 19:21
  • @RHA is so right. RStudio randomly hits peaks in memory usage. The only way I have had success getting it back down is resetting. – Cayce K Mar 23 '16 at 20:52
  • I seriously doubt that RStudio is the culprit as much as R. Check out the links to the right of this question for various explanations / solutions. – lmo Apr 10 '16 at 19:26

0 Answers