I am visiting a restaurant that has a menu with N dishes. Every time I visit the restaurant I pick one dish at random from the whole menu (so I may pick a dish I have already tasted). What is the average number of visits until I have tasted all N dishes?
I think that the number of dishes that I have tasted after n visits to the restaurant is a Markov chain with transition probabilities
p_{k,k+1} = (N-k)/N
and
p_{k,k} = k/N
for k = 0, 1, 2, ..., N-1, with state N absorbing (p_{N,N} = 1).
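From this chain, the waiting time to go from k to k+1 tasted dishes is geometric with success probability (N-k)/N, hence mean N/(N-k), so the total expected time is N * (1/N + 1/(N-1) + ... + 1/1) = N * H_N, where H_N is the N-th harmonic number. This is approximately N*log(N) but not equal to it; the two can be compared numerically:

```r
N <- 100
harmonic <- sum(1 / (1:N))  # H_N = 1 + 1/2 + ... + 1/N
exact <- N * harmonic       # exact expected number of visits, N * H_N
approx <- N * log(N)        # the N*log(N) approximation
c(exact = exact, approx = approx)  # exact ~ 518.7, approx ~ 460.5
```

So for N = 100 the exact mean is about 518.7 visits, while N*log(N) is about 460.5; the gap is roughly gamma*N (Euler-Mascheroni constant times N).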
I want to simulate this process in R (this is where I need help). Given that the restaurant has 100 dishes, I did:
nits <- 1000  # simulate the problem 1000 times
count <- 0
N <- 100  # number of dishes
for (i in 1:nits) {
  x <- 1:N  # dishes I have not tasted yet
  while (length(x) > 0) {
    # pick one untasted dish at random; sample.int avoids the
    # sample(x) pitfall when length(x) == 1 (sampling from 1:x)
    x <- x[-sample.int(length(x), 1)]
    count <- count + 1 / nits
  }
}
count
I want some help because my mathematical result is that the average time is N * H_N = N(1 + 1/2 + ... + 1/N), which is approximately N*log(N), and the code above produces different results.
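One likely source of the discrepancy: the loop samples only from the untasted dishes x, so tasted dishes are never repeated, whereas the Markov chain above has p_{k,k} = k/N, i.e. each visit can land on an already-tasted dish. If that reading is right, each visit should sample from the full menu; a sketch of that simulation:

```r
set.seed(1)
nits <- 1000  # number of simulated runs
N <- 100      # number of dishes
total <- 0
for (i in 1:nits) {
  tasted <- rep(FALSE, N)  # which dishes have been tasted so far
  visits <- 0
  while (!all(tasted)) {
    visits <- visits + 1
    tasted[sample.int(N, 1)] <- TRUE  # each visit picks from the FULL menu
  }
  total <- total + visits
}
total / nits  # should average around N * sum(1/(1:N)), i.e. ~518.7 for N = 100
```

With repeats allowed, the average of `total / nits` should come out near the theoretical value N * H_N rather than near N.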