I'm looking for a clean way to run simulations in a "functional style," where each iteration depends on the result of the preceding iteration. For example, say we want to draw a normally distributed sample of size 10 with mean meanInit, take the mean of that sample, use it as the mean for the next sample, and repeat. Using Reduce, we can perform 100 iterations:
meanInit <- 0
nextStep <- function(param, step) mean(rnorm(10, param))  # 'step' is never used
Reduce(nextStep, 1:100, meanInit, accumulate = TRUE)
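(With an initial value and accumulate = TRUE, this returns the initial value followed by the 100 iterated means, 101 values in all.)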
However, this feels a bit abusive, as our function nextStep isn't truly a binary operation; indeed, it ignores its second argument step. Is there an alternative that doesn't require us to "pretend" our function is a binary operation? I'm looking for a solution involving higher-order functions. No for loops, please.
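To make the shape of what I want concrete, here is a minimal sketch of the kind of helper I have in mind. The name iterate is my own invention (it is not in base R), and since its body just delegates to Reduce, it hides the dummy argument rather than eliminating it:

# Hypothetical helper: apply a unary function f to init, n times,
# collecting the initial value and all intermediate results.
iterate <- function(f, init, n) {
  Reduce(function(acc, step) f(acc), seq_len(n), init, accumulate = TRUE)
}

# The simulation then reads naturally, with no ignored argument in sight:
iterate(function(param) mean(rnorm(10, param)), 0, 100)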
Perhaps this question is best stated in the form of an analogy: sapply is to replicate as Reduce is to ______?
That is, if we wanted to, for example, take 100 random normal samples of size 10 and calculate the mean of each, we could (ab)use sapply with a function that ignores its argument as follows:
sapply(1:100, function(x) mean(rnorm(10)))
Or we could perform the same task cleanly (without writing a function that ignores its argument) using replicate:
replicate(100, mean(rnorm(10)))
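For what it's worth, replicate itself appears to be nothing more than a thin wrapper around sapply; printing it in an R session shows a body essentially like this (modulo formatting, with my comments added):

replicate
# function (n, expr, simplify = "array")
# sapply(integer(n), eval.parent(substitute(function(...) expr)),
#        simplify = simplify)
# Note the function(...) expr: replicate performs the same
# "ignore the arguments" trick internally; it just hides it from the caller.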
I'm looking for an analogous relative of Reduce.