
We are using officer to generate reports (about 100 pages with many graphs and tables) automatically. If we run the chapters separately, each file runs very fast. But when we run all 12 files together, the whole run takes up to an hour. We assume that all results stay in working memory, which causes the problem. Printing the document after each chapter and using rm() in between to remove objects that are no longer needed have no effect on the processing time.
Any ideas what can be done to "clear" the working memory or speed up the process?
Here's an abridged version of our code:

library(officer)

# `year`, `data_all`, `data` and `doc_template` are created earlier (not shown)
doc_output = file.path("C:/Doc/report.docx")

doc = read_docx(path = doc_template)

# Each chapter script adds its tables and graphs to `doc`
source(paste0("Chapter-1_", year, ".R"))
print(doc, target = doc_output)
# Keep only the objects the following chapters still need
rm(list = ls()[! ls() %in% c("year", "data_all", "data", "doc_template", "doc_output", "doc")])
gc()

source(paste0("Chapter-2_", year, ".R"))
print(doc, target = doc_output)
rm(list = ls()[! ls() %in% c("year", "data_all", "data", "doc_template", "doc_output", "doc")])
gc()

[...]

source(paste0("Chapter-11_", year, ".R"))
print(doc, target = doc_output)
rm(list = ls()[! ls() %in% c("year", "data_all", "data", "doc_template", "doc_output", "doc")])
gc()
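
The per-chapter blocks are identical apart from the chapter number, so the same run can be written more compactly as a loop (behaviour and timings are unchanged; `chapter` is kept in the rm() list so the loop variable is not removed by the clean-up):

for (chapter in 1:11) {
  source(paste0("Chapter-", chapter, "_", year, ".R"))
  print(doc, target = doc_output)
  rm(list = ls()[! ls() %in% c("year", "data_all", "data", "doc_template",
                               "doc_output", "doc", "chapter")])
  gc()
}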
Jurelisa
  • Please provide a minimal reproducible example. – Christoph Jul 09 '19 at 06:47
  • Can you share the way you're generating the reports? What do you mean with 'all 12 files together'? – Sven Jul 09 '19 at 07:15
  • @Sven: We are using the source() function to run the 12 files one after another. If we restart R after running one chapter, the following chapter runs very fast. But if we run all the files one after another, R becomes extremely slow. – Jurelisa Jul 09 '19 at 13:02
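
Edit, following the comments: since a fresh R session makes the next chapter fast again, one workaround we are considering is to run every chapter in its own short-lived R process via the callr package, each process continuing from the docx written so far. This is only a sketch, under the assumption that a chapter script needs nothing from the global environment beyond `doc`, `year`, `data_all` and `data` (which are passed in explicitly):

library(callr)
library(officer)

# Run one chapter script in a fresh R process; its memory is freed on exit
run_chapter <- function(chapter_file, doc_path, year, data_all, data) {
  callr::r(
    function(chapter_file, doc_path, year, data_all, data) {
      library(officer)
      doc <- read_docx(path = doc_path)    # pick up the report written so far
      source(chapter_file, local = TRUE)   # chapter script appends to `doc`
      print(doc, target = doc_path)        # persist before the process exits
    },
    args = list(chapter_file, doc_path, year, data_all, data)
  )
}

print(read_docx(path = doc_template), target = doc_output)  # start from the template
for (chapter in 1:11) {
  run_chapter(paste0("Chapter-", chapter, "_", year, ".R"),
              doc_output, year, data_all, data)
}

Passing `data_all` into every child process costs serialisation time on each call, so this trades working-memory growth for some extra I/O.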

0 Answers