
There is a memory-management process on a server I use that kills any process that exceeds a certain memory limit.

There are very large data files that must be loaded in their entirety, but doing so pushes the R session above that limit.

Is there a way to force R to limit the amount of RAM it uses and fall back to swap space/a swapfile once that limit is hit?
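
For illustration, one in-process control is capping the R process's address space, e.g. with the `unix` package on Linux; note that this appears to make allocations beyond the cap fail with an error rather than spill into swap, since the operating system decides when swap is used. A minimal sketch (the 8 GB figure is just a placeholder):

```r
# Minimal sketch: cap this R session's address space via the 'unix' package
# (Linux). Allocations beyond the cap raise an error; they do not spill
# into swap, because the OS alone decides when swap is used.
library(unix)

rlimit_as(cur = 8e9)   # soft-limit the address space to roughly 8 GB (bytes)
rlimit_as()            # query the current soft and hard limits
```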

Max Candocia
  • In general it's a function of the memory on the server. On (64-bit) Linux I do not set explicit values, but I have managed to get my sessions killed when trying to read data sets that required more RAM than I had (even iterating a few times on cloud instances, increasing RAM and retrying ...). And it is the server and operating system that decide when to turn on swap use. – Dirk Eddelbuettel Jun 09 '21 at 16:34
  • I haven't done that level of (low-level) memory management in a while, but I don't know that applications can (or, if it is possible, *choose* to) know whether any portion of their mapped memory is entirely within real RAM or partially in swap space; in fact, I believe one premise of OS-managed virtual memory is that applications need not know or care about it. Since you're dealing with "large data", I suggest you look into alternatives that are more memory-frugal, such as `data.table`, `disk.frame`, or a DBMS of some sort (certainly other similar options exist). – r2evans Jun 09 '21 at 16:57
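
A minimal sketch of the memory-frugal route suggested in the comment above, using `data.table::fread()` to read only the columns that are actually needed (the file path and column names are hypothetical):

```r
# Minimal sketch: avoid loading the whole file by selecting columns at read
# time with data.table::fread(). File path and column names are hypothetical.
library(data.table)

dt <- fread(
  "large_file.csv",
  select = c("id", "value", "timestamp")  # only these columns are read into RAM
)
```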

0 Answers