
I'm trying to optimize a model with 30,000 variables and 1,700 constraints, but I get this error when I add a few more constraints:

```r
n <- lp("max", f.obj, f.con, f.dir, f.rhs)$solution
```

Error: cannot allocate vector of size 129.9 Mb

I'm working on 32-bit Windows with 2 GB of RAM. What can I do so that I can optimize my model with a large dataset?

Forstools
  • Depending on where your problem comes from, you may be able to decompose it (approximately) into a set of smaller problems. This is the case, for instance, for linear programs coming from some stochastic optimization problems ([progressive hedging](http://mpc.zib.de/index.php/MPC/article/download/85/39)). – Vincent Zoonekynd Aug 29 '13 at 20:10

1 Answer


That's a tiny machine by modern standards, and a non-tiny problem. The short answer is that you should run this on a machine with a lot more RAM. Note that the problem isn't that R can't allocate 130 MB vectors in general -- it can -- it's that it has run out of memory on your specific machine.
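If you want to confirm where the memory is going, here's a rough sketch (assuming your objects are named as in the question; `memory.size()` and `memory.limit()` are Windows-only):

```r
# The dense constraint matrix alone is 1700 x 30000 doubles, roughly 390 Mb
format(object.size(f.con), units = "Mb")

memory.size()   # Mb currently used by this R session (Windows-only)
memory.limit()  # current ceiling; capped around 2-3 Gb on 32-bit Windows
gc()            # release anything no longer referenced before retrying
```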

I'd suggest running a 64-bit build of R 3.0 on a machine with 16 GB of RAM and seeing if that helps.
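A quick way to check which build you're actually on (just a sketch, nothing specific to lpSolve):

```r
# Confirm you are running a 64-bit build of R before re-running the model
R.version$arch               # e.g. "x86_64" on a 64-bit build
.Machine$sizeof.pointer * 8  # 64 on a 64-bit build, 32 on a 32-bit one
```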

You may want to look into spinning up a machine on the cloud, and using RStudio remotely, which will be a lot cheaper than buying a new computer.

Harlan