
I am trying to minimize a -2 log-likelihood using the nlm function in R. One component of my likelihood function is a double integral, which I compute with adaptIntegrate. With some pre-specified starting values, evaluating the likelihood is quite fast (less than a minute). However, when I use nlm to optimize, the first iteration takes a very long time (almost an hour), and eventually I get an error message about failing to reallocate memory. Does anyone know why this happens, and how I can fix it? This is my double integral function:

library(cubature)  # provides adaptIntegrate()

# sig1, sig2, alpha1, beta1, alpha2, beta2 and laplace() are defined elsewhere
doubleint <- function(y) {
  # integrand over x = (x[1], x[2]) on [0, y] x [0, y]
  h <- function(x) {
    sig1 * sig2 * exp(-sig1 * (y - x[1])) * exp(-sig2 * (y - x[2])) *
      laplace(alpha1 / beta1 * (exp(beta1 * y) - exp(beta1 * x[1])) +
              alpha2 / beta2 * (exp(beta2 * y) - exp(beta2 * x[2])))
  }
  sol <- adaptIntegrate(h, lowerLimit = c(0, 0), upperLimit = c(y, y),
                        fDim = 1, doChecking = FALSE, tol = 1e-5)$integral
  return(sol)
}
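For context, here is a simplified stand-in for how a function like this gets wrapped into the -2 log-likelihood that nlm minimizes; neg2loglik, y_obs, the way the parameters are packed into par, and the starting values are all placeholders rather than my actual model:

neg2loglik <- function(par, y_obs) {
  sig1   <<- par[1]; sig2  <<- par[2]   # doubleint() looks these up in the
  alpha1 <<- par[3]; beta1 <<- par[4]   # global environment, mirroring how the
  alpha2 <<- par[5]; beta2 <<- par[6]   # original function is written
  -2 * sum(log(sapply(y_obs, doubleint)))
}

fit <- nlm(neg2loglik, p = c(1, 1, 0.5, 0.1, 0.5, 0.1), y_obs = y_obs)  # placeholder starting values

Passing the parameters into doubleint() explicitly instead of through globals would be cleaner, but the sketch keeps the structure of the function above.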
Thaole
  • Increase the tolerance of the integration, or allow the process to use more memory (see the sketch after these comments). – Matthew Lundberg Aug 17 '14 at 17:36
  • @MatthewLundberg At some point after the first iteration, the parameters all become too extreme, which might be causing trouble for the integral function. I put some constraints on the parameters, but after several runs it just stopped and gave me the same error message. – Thaole Aug 19 '14 at 15:23
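A minimal sketch of the first comment's suggestion, assuming a hypothetical variant doubleint_capped; the looser tol of 1e-3 and the maxEval cap of 1e5 are illustrative values, not tested ones. Capping the number of function evaluations keeps adaptIntegrate from subdividing indefinitely (and allocating more and more memory) when extreme parameter values make the integrand hard to resolve:

library(cubature)

doubleint_capped <- function(y, tol = 1e-3, max_eval = 1e5) {
  h <- function(x) {
    sig1 * sig2 * exp(-sig1 * (y - x[1])) * exp(-sig2 * (y - x[2])) *
      laplace(alpha1 / beta1 * (exp(beta1 * y) - exp(beta1 * x[1])) +
              alpha2 / beta2 * (exp(beta2 * y) - exp(beta2 * x[2])))
  }
  # maxEval = 0 (the default) means no limit; a finite cap bounds the work per call
  adaptIntegrate(h, lowerLimit = c(0, 0), upperLimit = c(y, y),
                 fDim = 1, tol = tol, maxEval = max_eval)$integral
}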
