I am building an application in Google App Maker that accepts a user-uploaded CSV file (exported from Excel) with 3 columns and 370,573 rows, so 1,111,719 data values in total. I am trying to insert this data into a MySQL database efficiently by sending JDBC batch requests, but I am unsure how to optimize the process to minimize the time it takes. This is how I am currently doing it:
// conn is a JDBC connection obtained earlier via Jdbc.getConnection(...)
var file = DriveApp.getFileById(fileID);
var data = Utilities.parseCsv(file.getBlob().getDataAsString());
var stmt = conn.prepareStatement('INSERT INTO report '
    + '(createdDate, accountFullID, lsid) values (?, ?, ?)');
for (var i = 1; i < data.length; i++) {
  stmt.setString(1, data[i][0]);
  stmt.setString(2, data[i][1]);
  stmt.setString(3, data[i][2]);
  stmt.addBatch();
}
var batch = stmt.executeBatch();
conn.commit();
conn.close();
When testing this code, it took upwards of 3 minutes to complete when I capped the for-loop at 500 iterations. With a small cap like 5, it completed in a few seconds. When I let the loop run to data.length (as shown above), it never completed and eventually failed with a deadlock exception. How should I edit my code to execute the batches more efficiently and reduce the total time it takes to insert all of the entries from the CSV file, not just a small portion of it?
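To clarify what I mean by executing in batches: I am imagining something like splitting the parsed rows into fixed-size chunks and running one executeBatch()/commit() cycle per chunk. Here is a plain-JavaScript sketch of the chunking part only (chunkRows is a hypothetical helper I made up for this question, and the batch size of 1000 is an arbitrary assumption, not something my current code does):

```javascript
// Hypothetical helper: split parsed CSV rows into fixed-size batches.
// Each batch would then get its own addBatch()/executeBatch()/commit() pass.
function chunkRows(rows, batchSize) {
  var batches = [];
  for (var i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// e.g. chunkRows(data, 1000) would yield ~371 batches for my 370,573 rows
```

Is per-chunk committing along these lines the right direction, or is there a better approach?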