We are using the Google Drive v3 API to manage documents in our web application. We have a simple use case in which the user clicks a button and the backend needs to copy about 5-10 files from a source folder to a destination folder. I have tested with 6 files in the source folder, and the batch took about 7 seconds to complete. I have used batching to invoke the copy file API. Following is the code for the same:
Adding requests to the Queue:
for (Template template : templates) {
    File file = new File();
    file.setParents(Collections.singletonList(parentFileId));
    file.setName(template.getName());
    file.setWritersCanShare(false);
    file.setViewersCanCopyContent(false);

    Map<String, String> appProperties = new HashMap<>();
    appProperties.put(TEMPLATE_CODE_PROP_NAME, template.getCode());
    file.setAppProperties(appProperties);

    driveService.files().copy(template.getFileId(), file)
            .setFields("id, name, appProperties, webViewLink, iconLink, mimeType")
            .queue(batch, callback);
}
Callback that handles the response for each copy request in the batch:
JsonBatchCallback<File> callback = new JsonBatchCallback<File>() {
    @Override
    public void onSuccess(File file, HttpHeaders responseHeaders) throws IOException {
        log.info("Copied file successfully - " + file.getName() + " " + file.getId());
    }

    @Override
    public void onFailure(GoogleJsonError e, HttpHeaders responseHeaders) throws IOException {
        log.severe("Failed to copy file " + e.getCode() + " " + e.getMessage());
        // onFailure is only declared to throw IOException, so a plain checked
        // Exception cannot be thrown here; propagate an IOException instead.
        throw new IOException("Failed to copy file: " + e.getMessage());
    }
};
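For completeness, the batch itself is created and executed roughly like this (a sketch, not the exact code from our application; the callback above is defined before the loop so it can be passed to queue()):

// Sketch of the surrounding batch setup
BatchRequest batch = driveService.batch();

// ... define callback and queue the copy requests as shown above ...

// Single HTTP call that carries all queued copy requests
batch.execute();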
I have followed the best practices recommended by Google:
- Set only the fields required in the response, so we get a partial response instead of the complete resource
- Use batching when invoking the API
The API is taking about 7 seconds to complete this simple task, which is very poor performance from a user-experience perspective. Is this the expected delay, or am I doing something wrong here?
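For reference, one sanity check I can run is timing a single un-batched copy to get a per-request baseline and compare it to the 7-second batch. A rough sketch (templateFileId and parentFileId are placeholders, not the actual values from our code):

// Rough timing of one stand-alone copy request (placeholder IDs)
long start = System.nanoTime();
File copied = driveService.files()
        .copy(templateFileId, new File()
                .setName("timing-test")
                .setParents(Collections.singletonList(parentFileId)))
        .setFields("id")
        .execute();
long elapsedMs = (System.nanoTime() - start) / 1_000_000;
log.info("Single copy took " + elapsedMs + " ms (file " + copied.getId() + ")");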