I am not sure whether I am using do.call the right way:
test <- function(test) {
  # Turn the unevaluated call, e.g. rnorm(100, 1, 10), into a string
  string <- deparse(substitute(test))
  start <- regexpr("\\(", string)
  end <- regexpr("\\)", string) - 1
  # Everything before "(" is the distribution name, everything between the
  # parentheses is the comma-separated argument list
  distribution <- substr(string, 1, start - 1)
  string.arguments <- substr(string, start + 1, end)
  # Split the argument string on "," and coerce the pieces to numbers
  v <- read.table(text = unlist(strsplit(string.arguments, ",")))
  list.arguments <- lapply(t(v), function(x) x)
  for (i in 1:1000000) {
    do.call(distribution, list.arguments)
  }
}
The goal here is to be able to pass a distribution, such as rnorm or rgamma, followed by its arguments, to a function, instead of an already evaluated call.
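For reference, do.call() applies a function (given either as a function object or as its name) to a list of arguments, so the string parsing above could in principle be avoided by splitting the captured call directly. A minimal sketch of that idea (test2 and its body are my own illustration, not the original code):

test2 <- function(test) {
  call <- substitute(test)                 # the unevaluated call, e.g. rnorm(100, 1, 10)
  distribution <- as.character(call[[1]])  # function name, e.g. "rnorm"
  list.arguments <- as.list(call)[-1]      # the arguments as a list
  for (i in 1:1000000) {
    do.call(distribution, list.arguments)
  }
}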
Here is a comparison of using do.call and simply calling the function directly:
> system.time(test(rnorm(100, 1, 10)))
user system elapsed
17.772 0.000 17.820
> system.time(for(i in 1:1000000) { rnorm(100,0,1)} )
user system elapsed
13.940 0.004 14.015
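One variable that may be worth isolating (my assumption, not something measured above) is whether do.call receives the function's name or the function object itself, since passing the name means it has to be resolved to a function on each call:

# Hypothetical follow-up timings, not run as part of the comparison above
system.time(for (i in 1:1000000) do.call("rnorm", list(100, 1, 10)))
system.time(for (i in 1:1000000) do.call(rnorm, list(100, 1, 10)))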
The question is twofold:
- Does do.call really have to take roughly 25% longer?
- Is this the right approach to accept varying distributions and arguments?