I need to push about 1.18 MB, roughly 10,000 rows of data, from a CSV file residing on my server to a Google Sheet for Tableau to read.
The data comes in from Google DFP as a CSV file. I have previously used the gspread library to push CSV data into Google Sheets, but at roughly 30 seconds to post each record, 10,000 records make that approach unworkable.
Is there a faster way to copy the contents of a CSV/TXT file to a Google spreadsheet than with the gspread library? Preferably in Python.
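For reference, one route I've come across is gspread's import_csv method, which replaces a spreadsheet's contents with a CSV in a single request. A minimal sketch, assuming service-account credentials are already set up and SPREADSHEET_KEY is a placeholder for the target file's key:

import gspread

# Assumes a service-account JSON file is available (hypothetical path)
gc = gspread.service_account(filename='service_account.json')

with open('live_dfp_UTF.csv', 'r', encoding='utf-8') as f:
    csv_contents = f.read()

# Replaces the contents of the spreadsheet's first sheet in one request
gc.import_csv('SPREADSHEET_KEY', csv_contents)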
Update: I'm trying this approach of bulk-updating the cells:
import pandas as pd

# doc is an already-authorized gspread worksheet
raw_dfp = pd.read_csv('live_dfp_UTF.csv', error_bad_lines=False)
sample5 = raw_dfp.iloc[:3, :]  # first 3 rows as a test sample
rows, cols = sample5.shape

# Shrink, clear, then resize the sheet to fit the sample
doc.resize(1, 7)
doc.clear()
doc.resize(rows + 1, cols)

# Map the column count to its spreadsheet letter (enough for up to 11 columns)
column_names = ['', 'A', 'B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K']
cell_range = 'A1:' + column_names[cols] + str(rows)
cells = doc.range(cell_range)

# Flatten the DataFrame row by row so each value lines up with a cell
flattened_data = sample5.values.flatten()
for i, value in enumerate(flattened_data):
    cells[i].value = value

doc.update_cells(cells)  # one batched request instead of one per cell
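To cover the full 10,000 rows, I'm thinking of splitting the range into chunks so each update_cells call stays a reasonable size. A rough sketch, assuming doc is the same authorized worksheet and CHUNK_ROWS is just an arbitrary batch size:

CHUNK_ROWS = 1000  # arbitrary batch size, tune as needed

rows, cols = raw_dfp.shape
doc.resize(rows, cols)

for start in range(0, rows, CHUNK_ROWS):
    stop = min(start + CHUNK_ROWS, rows)
    cell_range = 'A%d:%s%d' % (start + 1, column_names[cols], stop)
    cells = doc.range(cell_range)
    # Both doc.range and flatten() are row-major, so values line up
    chunk = raw_dfp.iloc[start:stop, :].values.flatten()
    for cell, value in zip(cells, chunk):
        cell.value = value
    doc.update_cells(cells)  # one HTTP request per chunk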