I sometimes work with many large objects, and because of memory pressure it would be nice to get a fresh start between chunks. Consider the following example:
Warning: I have 8 GB of RAM. If you have less, this example might eat it all up.
<<chunk1>>=
a <- 1:200000000
@
<<chunk2>>=
b <- 1:200000000
@
<<chunk3>>=
c <- 1:200000000
@
The obvious workaround in this case is:
<<chunk1>>=
a <- 1:200000000
@
<<chunk2>>=
rm(a)
gc()
b <- 1:200000000
@
<<chunk3>>=
rm(b)
gc()
c <- 1:200000000
@
However, in my real example (which I can't post because it relies on a large dataset), even after I remove all of the objects and run gc(), R does not release all of the memory (only some of it). The reason is found in ?gc:
However, it can be useful to call ‘gc’ after a large object has
been removed, as this may prompt R to return memory to the
operating system.
Note the important word "may". The R documentation hedges with "may" like this in many places, so this is not a bug.
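To make the distinction concrete, here is a minimal sketch (not from my document; the object size and exact numbers are machine-dependent assumptions) comparing what gc() reports before and after removing a large object:

```r
# Illustrative sketch: compare gc()'s "used" figures before and after
# removing a large object. The size below (~80 MB) is an assumption;
# pick something comfortably large for your machine.
a <- rnorm(1e7)                    # ~80 MB of doubles
before <- gc()                     # memory snapshot while 'a' is alive
rm(a)
after <- gc()                      # 'a' is gone from R's heap
before["Vcells", "used"] > after["Vcells", "used"]   # TRUE: R freed it
# ...but the process footprint reported by the OS (top, Task Manager)
# "may" or may not shrink, which is exactly the hedge ?gc warns about.
```

So R's own bookkeeping shows the object is gone, even when the operating system still reports a large resident size for the process.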
Is there a chunk option with which I can have knitr start a new R session?