You can get the upper bound of the memory that is in use during the processing of a function and its commands with gc():
smallest.sv <- function() {
  A <- matrix(rnorm(1e6), 1e3)
  mysvd <- svd(A)
  return(tail(mysvd$d, 1))
}
tt <- sum(.Internal(gc(FALSE, TRUE, TRUE))[13:14])  #reset "max used" and record the baseline in Mb
x <- smallest.sv()
sum(.Internal(gc(FALSE, FALSE, TRUE))[13:14]) - tt  #elements 13:14 are the "max used (Mb)" values for Ncells and Vcells
#62 MB
rm(x)
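The same measurement can be made without .Internal by using the documented reset argument of gc(). A minimal sketch, assuming the last column of the matrix returned by gc() is the "max used (Mb)" column (which holds whether or not the optional "limit (Mb)" column is present):
g0 <- gc(reset = TRUE)      #reset the "max used" statistics and take a baseline
x <- smallest.sv()
g1 <- gc()                  #read the statistics without resetting them
sum(g1[, ncol(g1)]) - sum(g0[, ncol(g0)])  #peak in Mb since the reset
rm(x)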
This upper bound is influenced by garbage collection, so turning on gctorture() will give the lowest upper bound:
tt <- sum(.Internal(gc(FALSE, TRUE, TRUE))[13:14])
gctorture(on = TRUE)
x <- smallest.sv()
gctorture(on = FALSE)
sum(.Internal(gc(FALSE, FALSE, TRUE))[13:14]) - tt
#53.7 MB
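If you need this measurement repeatedly, the pattern can be wrapped in a small helper. peak_mb() below is a hypothetical convenience function (not from any package) that relies on R's lazy evaluation, so the expression is only evaluated after the statistics have been reset:
peak_mb <- function(expr) {      #hypothetical helper, not part of any package
  g0 <- gc(reset = TRUE)         #expr is still an unevaluated promise here
  force(expr)                    #evaluate the expression now, in the caller's environment
  g1 <- gc()
  sum(g1[, ncol(g1)]) - sum(g0[, ncol(g0)])  #peak in Mb since the reset
}
peak_mb(x <- smallest.sv())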
Other tools like Rprof, Rprofmem, profmem::profmem, bench::mark or profvis::profvis can also show the memory usage.
#Using Rprof (enabling profiling is a compile-time option: ./configure --enable-R-profiling)
gc()
Rprof("Rprof.out", memory.profiling=TRUE)
x <- smallest.sv()
Rprof(NULL)
max(summaryRprof("Rprof.out", memory="both")$by.total$mem.total)
#45.9
#The status is sampled at fixed intervals, so the result depends on whether a sample hits the peak
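The same summaryRprof() call also gives a per-function breakdown, so you can see which calls account for the peak:
#Per-function view: the mem.total column holds the memory attributed to each call
head(summaryRprof("Rprof.out", memory = "both")$by.total)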
#Using Rprofmem (enabling memory profiling is a compile-time option: ./configure --enable-memory-profiling)
Rprofmem("Rprofmem.out"); x <- smallest.sv(); Rprofmem(NULL) #When first run, there is much more in the log file
gc()
Rprofmem("Rprofmem.out")
x <- smallest.sv()
Rprofmem(NULL)
sum(as.numeric(read.table("Rprofmem.out", comment.char = ":")[,1]), na.rm=TRUE)
#88101752
#Writes out the amount of memory at the moment it is allocated
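You can inspect the raw log to see what is recorded; each regular line starts with the number of bytes allocated, followed by the call stack at the point of allocation:
head(readLines("Rprofmem.out"))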
library(profmem) #uses utils::Rprofmem
gc()
total(profmem(x <- smallest.sv()))
#88101752
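The object returned by profmem() can also be inspected directly; a sketch, assuming here that it behaves as a data frame of individual allocations:
p <- profmem(x <- smallest.sv())
total(p)   #the summed bytes, as above
head(p)    #the first individual allocations with their call stacks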
library(bench) #uses utils::Rprofmem
gc()
mark(x <- smallest.sv())[,"mem_alloc"]
#84MB
#Warning message:
#Some expressions had a GC in every iteration; so filtering is disabled.
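bench::mark keeps the underlying allocation log as well; a sketch, assuming the result's memory list-column contains the parsed Rprofmem output per iteration:
res <- mark(x <- smallest.sv())
res$mem_alloc    #total allocated, as above
res$memory[[1]]  #individual allocations of the first iteration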
library(profvis) #uses utils::Rprof
gc()
profvis(x <- smallest.sv())
#Opens a browser window where, under Memory, you can read: -23.0 | 45.9
Rprofmem shows the memory that was cumulatively allocated and does not take into account the memory that was freed during execution. To increase the probability that Rprof hits the peak, you can select a short sampling interval and/or repeat the procedure:
max(replicate(10, {
  gc()
  Rprof("Rprof.out", memory.profiling = TRUE, interval = runif(1, .005, 0.02))
  x <- smallest.sv()
  Rprof(NULL)
  max(summaryRprof("Rprof.out", memory = "both")$by.total$mem.total)
}))
#76.4
Here I got a higher value than I got from gc, which demonstrates that memory usage is influenced by garbage collection and that the upper bound of the memory in use during the processing of a function may vary from call to call as long as gctorture is not turned on.