I am having trouble with a relatively straight-forward ODE simulation in deSolve. I have previously run simulations with for() loops in R, but wanted to switch to a solver to improve some tipping point analyses. When I try to run the basic simulation, I get the following error: "Internal T (=R1) and H (=R2) are such that in the machine, T + H = T on the next step (H = step size)."
Thank you in advance for any help on this!
I have tried changing parameter values I set up my model function as such:
mod.fun <- function (t, x, params) {
# population sizes
B <- x[1]
N <- x[2]
C <- x[3]
F1 <- x[4]
E <- x[5]
# parameters
p <- params[1] #annual growth rate of host tree (B/N)
a <- params[2] #percent of annual productivity allocated to labile carbon pool (unit-less)
mB <- params[3] #annual mortality/maintenance (unit-less)
mN <- params[4] #(unit-less)
r <- params[5] #rate of resource sharing from tree to fungus in (1/F)
e <- params[6] #fungal resource conversion efficiency (F/C)
mF <- params[7] # fungal turnover rate annually (unit-less)
u <- params[8] #fungal uptake capacity (1/F)
s <- params[9] # percent of fungal resources shared to host (unit-less)
# change in population size over change in time
dBdt <- p * N - (a + mB)*B
dCdt <- a*B - (r*F1*C )
dF1dt <- e*r*F1*C-mF*F1
dNdt <- u*F1*s*E - mN*N
dEdt <- -u*F1*s*E + mN*N
# function outputs
return(list(c(dBdt, dBdt, dF1dt, dNdt, dEdt)))
}
Then set up the rest of the inputs for deSolve solvers and run lsoda:
N0 <- 25
E0 <- 100 -N0
B0 <- 50
C0 <- 20
F10 <- 20
# vector of initial values
initial_values <- c(N = N0, E = E0, B = B0, C = C0, F1 = F10)
p <- 1
a <- .40
mB <- .25
mN <- .25
r <- 1
e <- 1
mF <- 0.5
u <- 1
s <- 1
parameters <- c(p,a, mB, mN, r, e, mF, u, s)
times <- seq(from = 0, to = 60, length.out = 20000)
results <- deSolve::lsoda(initial_values, times, mod.fun, parameters)