I'm working on a project with a rather large workspace. Unfortunately, save.image() freezes and I can't save it. With a small workspace containing just a dataframe, save.image() works fine:
> library(dplyr); library(tidyr); library(tidyverse); library(tidytext); library(pryr)
> master = readRDS("data")
> pryr::object_size(master)
527 MB
> save.image(safe=F)
> pryr::mem_used()
682 MB
> memory.limit()
[1] 8142
This takes about 10 seconds and saves the 116 MB compressed .RData file just fine. If I try save.image(compress=F) instead, it takes less than a second.
> master_tidy = master %>% unnest_tokens(word, text)
> pryr::object_size(master_tidy)
565 MB
> pryr::mem_used()
758 MB
But now if I run save.image() or save.image(compress=F), it gets stuck and I have to terminate R, since the stop request doesn't work either. Task Manager shows that while R is stuck it uses 100+ MB/s of disk and about 2% CPU (depending on the compression type), but even after 15 minutes save.image() is still running. I also see the .RDataTmp files in the directory, and save.image(safe=F) doesn't help either. I find it strange that after unnest_tokens() I can no longer use save.image(). However, I can't reproduce this with the Shakespeare tidytext example, so I'm not sure what the problem is.
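For reference, this is the shape of the reproduction I attempted, sketched with synthetic data standing in for my real file (which I can't share); the `text` column name and the data sizes here are assumptions, not my actual data:

```r
library(dplyr)
library(tidytext)

# Synthetic stand-in for my data: many short documents in a `text` column.
# My real `master` comes from readRDS("data") and is ~527 MB.
master <- tibble::tibble(
  id   = seq_len(1e5),
  text = replicate(1e5, paste(sample(letters, 50, replace = TRUE), collapse = " "))
)

# One row per token, as in my real workflow
master_tidy <- master %>% unnest_tokens(word, text)

# This is the call that hangs on my real data but completes on synthetic data
save.image(compress = FALSE)
```

With synthetic data like this (and with the Shakespeare example), save.image() returns normally, so whatever triggers the hang seems specific to my dataset.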