I am running code in gganimate to create an animation using a large amount of data.

I have 32 GB of RAM, but it just isn't quite enough: I get the error 'cannot allocate vector of size XXX'. I can get it to run if I make the data smaller (using moving averages), but I would ideally like the animation at a higher quality.

I would have thought there would be a way to bypass the RAM limit and spill to the SSD instead (where I have 600 GB of storage available), or at least use RAM first and fall back to the SSD when RAM is near capacity. The time it takes for the animation to render is not important, as I can let it run while I do other things.
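One disk-backed option inside R (a sketch, assuming the `bigmemory` package; the file path is hypothetical) is a file-backed matrix, which memory-maps the data on the SSD instead of holding it all in RAM:

```r
# Sketch: keep a large numeric matrix file-backed on the SSD.
# Assumes the 'bigmemory' package is installed; path is a placeholder.
library(bigmemory)

x <- filebacked.big.matrix(
  nrow = 1e8, ncol = 3,
  backingfile    = "anim_data.bin",   # data file stored on the SSD
  descriptorfile = "anim_data.desc",  # small metadata file
  backingpath    = "D:/bigdata"       # hypothetical SSD directory
)
x[1, ] <- c(0, 1, 2)  # reads/writes go through a memory map, not RAM
```

The caveat is that ggplot2/gganimate still want an ordinary data frame when they actually draw, so this mainly helps with the heavy preprocessing: you can pull one frame's worth of rows into RAM at a time rather than the whole dataset.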

I have experimented with AWS as well but they seem quite complicated and I don't really want to spend money if there is a free option available.

Or perhaps there is a way in gganimate to avoid doing everything in one go, and instead render the animation in stages so that memory capacity isn't an issue.
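A staged approach I have seen sketched (assumptions: `make_plot()` is a hypothetical helper that builds the gganim object for a subset of your data, and `dat` is your full data frame) is to render each chunk's frames to PNG files with gganimate's `file_renderer()`, then stitch all the PNGs into one GIF with the `gifski` package, so the complete animation never has to fit in RAM at once:

```r
# Sketch, assuming the gganimate and gifski packages.
library(gganimate)
library(gifski)

# Split the rows into 4 chunks; render each chunk's frames to disk.
chunks <- split(seq_len(nrow(dat)), cut(seq_len(nrow(dat)), 4))
for (i in seq_along(chunks)) {
  p <- make_plot(dat[chunks[[i]], ])  # hypothetical plot-building helper
  animate(p, nframes = 50,
          renderer = file_renderer("frames",
                                   prefix = sprintf("c%02d_", i),
                                   overwrite = TRUE))
}

# Stitch every rendered frame into a single GIF.
png_files <- list.files("frames", pattern = "\\.png$", full.names = TRUE)
gifski(sort(png_files), gif_file = "full_animation.gif")
```

One thing to watch: transitions that interpolate across the whole dataset (e.g. `transition_time()`) may look discontinuous at chunk boundaries, so the chunks need to be cut at natural break points in the animation.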

In the past I have had the suggestion to use another program or package that can handle larger data, but I have already invested a lot of time in this complicated R code, and it does exactly what I want except for the RAM issue; moving to something like Python would be a huge waste of time. I'd only be willing to do this as a last resort.

Thanks for any help!
