I am trying to minimize a -2 log-likelihood using the nlm function in R. One component of my likelihood is a double integral, which I compute with adaptIntegrate (from the cubature package). With my pre-specified starting values, a single likelihood evaluation is quite fast (less than a minute). However, when I optimize with nlm, the first iteration alone takes almost an hour, and eventually I get an error message about memory reallocation. Does anyone know why this happens, and how I can fix it? This is my double integral function:
library(cubature)  # provides adaptIntegrate()

# Double integral of h over [0, y] x [0, y]; sig1, sig2, alpha1, beta1,
# alpha2, beta2 and laplace() are defined elsewhere in my script.
doubleint <- function(y) {
  h <- function(x) {
    sig1 * sig2 * exp(-sig1 * (y - x[1])) * exp(-sig2 * (y - x[2])) *
      laplace(alpha1 / beta1 * (exp(beta1 * y) - exp(beta1 * x[1])) +
              alpha2 / beta2 * (exp(beta2 * y) - exp(beta2 * x[2])))
  }
  adaptIntegrate(h, lowerLimit = c(0, 0), upperLimit = c(y, y),
                 fDim = 1, doChecking = FALSE, tol = 1e-5)$integral
}
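In case it matters, here is roughly how the integral feeds into the objective that nlm minimizes. This is only a minimal sketch: yvec, start_values, the way par is unpacked, and treating each integral as a full likelihood contribution are placeholders, laplace() is defined elsewhere, and the tol/maxEval settings are loosened here just to keep a single evaluation cheap. Since nlm uses numerical gradients when none are supplied, each iteration calls this objective many times (once per parameter plus function evaluations for the line search), which is presumably why one iteration takes so much longer than one likelihood evaluation.

library(cubature)

neg2loglik <- function(par, yvec) {
  # unpack parameters; names and ordering are placeholders
  sig1 <- par[1]; sig2 <- par[2]
  alpha1 <- par[3]; beta1 <- par[4]
  alpha2 <- par[5]; beta2 <- par[6]

  # one double integral per observation -- the expensive part
  one_obs <- function(y) {
    h <- function(x) {
      sig1 * sig2 * exp(-sig1 * (y - x[1])) * exp(-sig2 * (y - x[2])) *
        laplace(alpha1 / beta1 * (exp(beta1 * y) - exp(beta1 * x[1])) +
                alpha2 / beta2 * (exp(beta2 * y) - exp(beta2 * x[2])))
    }
    # looser tol and a maxEval cap keep each evaluation from running away
    adaptIntegrate(h, lowerLimit = c(0, 0), upperLimit = c(y, y),
                   fDim = 1, tol = 1e-3, maxEval = 1e5)$integral
  }

  ints <- vapply(yvec, one_obs, numeric(1))
  -2 * sum(log(ints))
}

# hypothetical call; start_values and yvec come from my data/setup
# fit <- nlm(neg2loglik, p = start_values, yvec = yvec)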