I upload a .csv file to Google Sheets via gspread, but the file has grown large enough (51 MB right now) that the script seems to time out before the upload completes. I'm working under this assumption because shrinking the file to 30 MB (by deleting some columns) makes the upload succeed.
I saw the answer by @Lavigne958 to "How to solve gspread API ReadTimeout limit (Python)?", which mentions that timeouts can now be modified in gspread, but I'm not sure how to go about doing that (and I don't have enough reputation to comment there).
Would anyone know how I can go about setting the timeout on my requests?
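From that answer, my best guess is that after authorizing, the client exposes a `set_timeout` method that raises the HTTP read timeout above the 120 seconds shown in my traceback. Here is a sketch of how I'd wire it into my function; `set_timeout` and the 300-second value are assumptions from that answer, not something I've verified against the gspread docs:

```python
import csv

def csv_import_with_timeout(csv_file, gsheet_file_id, gsheet_tab_name,
                            timeout=300):
    # gspread/oauth2client are imported inside the function only so this
    # sketch can be read (and the signature inspected) without them installed.
    import gspread
    from oauth2client.service_account import ServiceAccountCredentials

    scope = ["https://www.googleapis.com/auth/spreadsheets",
             "https://www.googleapis.com/auth/drive"]
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        'googlecredentials.json', scope)
    client = gspread.authorize(credentials)
    # Assumption: set_timeout replaces the default 120 s read timeout.
    client.set_timeout(timeout)
    sh = client.open_by_key(gsheet_file_id)
    with open(csv_file, newline='') as f:
        sh.values_update(
            gsheet_tab_name,
            params={'valueInputOption': 'USER_ENTERED'},
            body={'values': list(csv.reader(f))},
        )
```

Is this the right way to apply the timeout, or does it belong somewhere else (e.g. on the underlying session)?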
My specific error is:
```
ReadTimeout: HTTPSConnectionPool(host='sheets.googleapis.com', port=443): Read timed out. (read timeout=120)
```
My CSV import function is:
```python
import csv
import gspread
from oauth2client.service_account import ServiceAccountCredentials

def CSV_Import(csv_file='', gsheet_file_id='', gsheet_tab_name=''):
    scope = ["https://spreadsheets.google.com/feeds",
             "https://www.googleapis.com/auth/spreadsheets",
             "https://www.googleapis.com/auth/drive.file",
             "https://www.googleapis.com/auth/drive"]
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        'googlecredentials.json', scope)
    client = gspread.authorize(credentials)
    sh = client.open_by_key(gsheet_file_id)
    with open(csv_file, newline='') as f:  # close the file after reading
        sh.values_update(
            gsheet_tab_name,
            params={'valueInputOption': 'USER_ENTERED'},
            body={'values': list(csv.reader(f))},
        )
```
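In case raising the timeout isn't enough on its own, I've also been sketching a fallback that splits the rows into batches and calls `values_update` once per batch, advancing the A1 range each time. The batch size and the range arithmetic below are my own guesses and I haven't tested them against the API:

```python
def row_batches(rows, batch_size=10000):
    """Yield (start_row, batch) pairs; start_row is 1-based for A1 ranges."""
    for i in range(0, len(rows), batch_size):
        yield i + 1, rows[i:i + batch_size]

# Each batch would then go out as something like:
#   sh.values_update(f"{gsheet_tab_name}!A{start_row}",
#                    params={'valueInputOption': 'USER_ENTERED'},
#                    body={'values': batch})
```

Would per-batch requests like this each get their own 120-second window, or does the timeout apply to something else?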
My call is:
```python
CSV_Import(
    csv_file="file_to_upload_here",
    gsheet_file_id="gsheet_file_id_here",
    gsheet_tab_name="gsheet_tab_name_here",
)
```