A practical solution with low peak RAM usage can look like:
my_list <- list(1)
N <- length(my_list)
length(my_list) <- N + 9
for (i in 2:10) {
  my_list[[N + i - 1]] <- i
  # gc()  # optional
}
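A quick check, added just for illustration, that the loop fills the pre-sized list as intended:
unlist(my_list)
# 1  2  3  4  5  6  7  8  9 10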
You can use gc
to get the peak RAM usage, but the result is much influenced by whether a garbage collection happened during execution. To see the minimum possible peak, gctorture
can be turned on, but then execution typically gets much slower. As the result could also be influenced by the order in which the methods are called, I start a new vanilla session for each measurement.
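One way to do that, sketched here with hypothetical script names (this is not the exact setup used for the numbers below), is to put each benchmark into its own file and launch it with Rscript --vanilla:
# Hypothetical file names; each script contains one of the benchmarks below.
for (f in c("bench_append.R", "bench_assign.R", "bench_resize.R")) {
  system2("Rscript", c("--vanilla", f))  # fresh vanilla session per run
}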
#Using append
n <- 1e5
gctorture(on=TRUE)
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
for (i in 2:10) L <- append(L, list(sample(n)))
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 344156 18.4 664228 35.5 345174 18.5
#Vcells 1215086 9.3 8388608 64.0 1265554 9.7
#Using [[<-
n <- 1e5
gctorture(on=TRUE)
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
for (i in 2:10) L[[length(L)+1]] <- sample(n)
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 346937 18.6 664228 35.5 347919 18.6
#Vcells 1221639 9.4 8388608 64.0 1272088 9.8
#Using [[<- but resizing the list before
n <- 1e5
gctorture(on=TRUE)
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
N <- length(L)
length(L) <- N + 9
for (i in 2:10) L[[N - 1 + i]] <- sample(n)
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 346564 18.6 664228 35.5 347498 18.6
#Vcells 1220761 9.4 8388608 64.0 1271479 9.8
Here append
needs 8.0 Mb and [[<-
needs 8.2 Mb, independent of whether the list is resized beforehand or not.
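These figures correspond to the growth of the two max used (Mb) values (Ncells plus Vcells) relative to the gc(reset=TRUE) baseline:
(18.5 - 15.3) + (9.7 - 4.9)  # append: 8.0 Mb
(18.6 - 15.3) + (9.8 - 4.9)  # [[<-, with or without resizing: 8.2 Mb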
Doing the same without gctorture
but calling gc
manually after each step gives:
#Using append
n <- 1e5
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
for (i in 2:10) {L <- append(L, list(sample(n))); gc()}
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 344145 18.4 664228 35.5 372952 20.0
#Vcells 1215054 9.3 8388608 64.0 1319826 10.1
#Using [[<-
n <- 1e5
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
for (i in 2:10) {L[[length(L)+1]] <- sample(n); gc()}
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 346926 18.6 664228 35.5 377474 20.2
#Vcells 1221607 9.4 8388608 64.0 1352555 10.4
#Using [[<- but resizing the list before
n <- 1e5
set.seed(0)
L <- list(sample(n))
gc(reset=TRUE)
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 285638 15.3 664228 35.5 285638 15.3
#Vcells 633121 4.9 8388608 64.0 633121 4.9
N <- length(L)
length(L) <- N + 9
for (i in 2:10) {L[[N - 1 + i]] <- sample(n); gc()}
gc()
# used (Mb) gc trigger (Mb) max used (Mb)
#Ncells 347659 18.6 664771 35.6 374526 20.1
#Vcells 1223042 9.4 8388608 64.0 1273592 9.8
Here append
needs 9.9 Mb, [[<-
without resizing the list in advance needs 10.4 Mb, and with the list resized beforehand 9.7 Mb.
In case you want to know the total amount of memory that was allocated (including memory that may have been freed again in the meantime), or want other options, have a look at Monitor memory usage in R.
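For example, base R's Rprofmem logs the individual allocations (it requires R to have been compiled with memory profiling enabled); a minimal sketch, with a file name chosen just for illustration:
n <- 1e5
L <- list(sample(n))
Rprofmem("allocs.out")                           # start logging allocations to a file
for (i in 2:10) L[[length(L) + 1]] <- sample(n)
Rprofmem(NULL)                                   # stop logging
head(readLines("allocs.out"))                    # each line: bytes allocated plus the call that triggered it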