
I'm trying to wrangle data from a shared Excel file in OneDrive. I have synced OneDrive to my MacBook, so I can easily access the file in R with a path. I have the following code:

library(readxl)  # for excel_sheets() and read_excel()

my_data <- "Users/B.../Folder/my_file.xlsx"
excel_sheets(path = my_data)

I'm not sure whether more of my code is necessary for this question, but I then go on to organize the data:

library(plyr)  # for rbind.fill()

sheets <- excel_sheets(path = my_data)

list_all_2 <- lapply(sheets, function(x) read_excel(path = my_data, sheet = x))

str(list_all_2)

my_data_2 <- rbind.fill(list_all_2)
str(my_data_2)

My issue comes at this point. As I said, I'm using data from a shared Excel file, and I want to re-run this code periodically so the analyses reflect newly changed or added data. However, when I go into the Excel file and change something, the change isn't reflected in the data frame in my working directory after re-running, so I'm not sure whether it's working or how to fix it.
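For what it's worth, one way to make the re-read explicit is to wrap the whole read-and-combine pipeline in a function, so that every call goes back to the workbook on disk. This is a sketch, not the asker's code; the function name `read_all_sheets` is my own:

```r
library(readxl)  # excel_sheets(), read_excel()
library(plyr)    # rbind.fill()

# Read every sheet of the workbook at `path` and row-bind them.
# Each call re-reads the file from disk, so re-running it after the
# shared file has synced should pick up the latest data.
read_all_sheets <- function(path) {
  sheets <- excel_sheets(path = path)
  list_all <- lapply(sheets, function(x) read_excel(path = path, sheet = x))
  rbind.fill(list_all)
}

# my_data_2 <- read_all_sheets(my_data)
```

Note that with OneDrive the local copy only updates once the client has finished syncing, so a re-read immediately after an edit elsewhere may still see the old contents.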

  • @stefan_aus_hannover sorry could you explain more – lerm16 Jan 20 '22 at 17:04
  • A script is going to run once. Are you trying to trigger a rerun of the script when the excel file is saved? I think you'll need some sort of additional tool to stay running and check for changes. – camille Jan 20 '22 at 20:04
  • Are you saying that you are re-running the script AFTER you have made the change, and it still seems to be the same data as you had done BEFORE the change is made? – Phil Jan 20 '22 at 21:49
  • @Phil yes exactly! – lerm16 Jan 21 '22 at 15:29
  • @camille, yes this could be. I'm fairly new to this so I thought by re-running it it would be re-doing and re pulling the data and would therefore be updated – lerm16 Jan 21 '22 at 15:30
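Along the lines of camille's comment about checking for changes, a rough sketch using base R's `file.mtime()` to re-read only when the file's modification time has moved (the re-read step shown is the question's own code; the variable names are illustrative):

```r
# Record when the workbook was last read.
last_read <- file.mtime(my_data)

# ... later, before re-running the analysis:
if (file.mtime(my_data) > last_read) {
  # The file changed on disk: pull fresh data.
  sheets <- excel_sheets(path = my_data)
  list_all_2 <- lapply(sheets, function(x) read_excel(path = my_data, sheet = x))
  last_read <- file.mtime(my_data)
}
```

This only detects changes once OneDrive has actually synced them to the local copy, so it is a check on the synced file, not on the live shared document.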
