
I've got an R script that reads data from a DB and writes it to a Google spreadsheet (the script reads several tables, each written to a different sheet of the spreadsheet). It works fine, but the problem I'm having is that since some of the tables are quite large, the googlesheets4::range_write function randomly gives a time-out error because the sheet is still busy processing the previously written tables. I tried writing code to catch the error, and also used the retry function (from the retry package), but it doesn't seem to work; I just can't find an appropriate way to catch the error that range_write gives when timing out. What I need should look something like this (I guess):

retry::retry(
  range_write(my_sheet,
              my_data,
              sheet = sheet,
              range = "A1",
              col_names = TRUE),
  when = "some error",
  max_tries = 3,
  interval = 10)

As mentioned, the previous code runs, but it fails to catch the error that range_write gives when timing out; it still stops with: "Error: Server error: (503) UNAVAILABLE".

If I retry the call for the table that failed after a few seconds, it usually works, so I know it is only a matter of retrying (2 or 3 times, if necessary), giving the sheet some seconds to process the information.

Any help or tips to make this work will be immensely appreciated!

1 Answer


SOLVED: wrapping the call in try and saving the result to a variable solves the problem, like this:

result <- try(range_write(my_sheet, my_data, sheet = sheet, range = "A1", col_names = TRUE))

That way the range is written if the call succeeds, but the error is saved in the 'result' variable if it fails. The result can then be checked in a while loop to repeat the call if necessary.
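
The while loop described above could be sketched like this (just an illustration; my_sheet, my_data, and sheet are the same placeholders as in the question, and the 3-attempt / 10-second values are arbitrary). try() returns an object of class "try-error" on failure, which inherits() can detect:

```r
library(googlesheets4)

attempt <- 1
result <- try(range_write(my_sheet, my_data, sheet = sheet,
                          range = "A1", col_names = TRUE),
              silent = TRUE)

# Retry while the last attempt failed, up to 3 attempts in total
while (inherits(result, "try-error") && attempt < 3) {
  Sys.sleep(10)  # give the spreadsheet time to finish processing
  attempt <- attempt + 1
  result <- try(range_write(my_sheet, my_data, sheet = sheet,
                            range = "A1", col_names = TRUE),
                silent = TRUE)
}

if (inherits(result, "try-error")) {
  stop("range_write() still failed after ", attempt, " attempts")
}
```

silent = TRUE suppresses the error message being printed on each failed attempt; drop it if you want to see the intermediate 503 errors in the console.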