1. About the 2nd API request
If this is for "Retrieve a job status", it seems that the sample curl command is as follows.
curl https://api.intercom.io/export/content/data/v5xsp8c19ji0s82 \
-X GET \
-H 'Authorization:Bearer <Your access token>' \
-H 'Accept: application/json'
It seems that the value of job_identifier can be retrieved from the response of your 1st request shown in your question. Ref
When this request is run for the first time, it seems that status returns pending. By this, it seems that it is required to wait until status is changed to complete. Ref
2. About the 3rd API request
If this is for "Retrieve the exported data", it seems that the sample curl command is as follows.
curl https://api.intercom.io/download/content/data/xyz1234 \
-X GET \
-H 'Authorization:Bearer <Your access token>' \
-H 'Accept: application/octet-stream'
In this case, the document says "Your exported message data will be streamed continuously back down to you in a gzipped CSV format." I thought that in this case, the returned data might be able to be ungzipped with Utilities.ungzip, and the ungzipped data might be able to be parsed with Utilities.parseCsv.
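For example, a minimal sketch of that pair of calls is as follows. Here, a gzipped CSV blob is created locally just to demonstrate the calls, and the sample CSV content is an assumption, not actual Intercom data.
function sampleUngzipAndParse() {
  // Create a gzipped CSV blob locally, only to demonstrate the calls. (Hypothetical sample data.)
  var csvText = "id,body\n1,hello\n2,world";
  var gzBlob = Utilities.gzip(Utilities.newBlob(csvText, "text/csv"));
  // Ungzip the blob and parse the CSV text into a 2-dimensional array.
  var csvBlob = Utilities.ungzip(gzBlob);
  var ar = Utilities.parseCsv(csvBlob.getDataAsString());
  console.log(ar); // [["id","body"],["1","hello"],["2","world"]]
}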
3. Using your 3 requests, how about the following sample script?
function exportjob2() {
  var accessToken = "###your access token###"; // Please set your access token.

  // 1st request: This is from https://stackoverflow.com/a/73032528
  var url1 = 'https://api.intercom.io/export/content/data';
  var options1 = {
    "method": "post",
    "headers": {
      "Authorization": "Bearer " + accessToken,
      "Accept": "application/json",
    },
    "contentType": "application/json",
    "payload": JSON.stringify({
      "created_at_after": 1654041600,
      "created_at_before": 1656547200
    })
  };
  var response1 = UrlFetchApp.fetch(url1, options1);
  var { job_identifier } = JSON.parse(response1.getContentText());

  // 2nd request <--- Modified
  var url2 = 'https://api.intercom.io/export/content/data/' + job_identifier;
  var options2 = {
    "headers": {
      "Authorization": "Bearer " + accessToken,
      "Accept": "application/json",
    },
  };
  var response2 = UrlFetchApp.fetch(url2, options2);
  // console.log(response2.getContentText()); // for debug.
  var { download_url, status } = JSON.parse(response2.getContentText());
  while (status == "pending") {
    Utilities.sleep(5000); // Please adjust this value. The current wait time is 5 seconds.
    response2 = UrlFetchApp.fetch(url2, options2);
    // console.log(response2.getContentText()); // for debug.
    var obj = JSON.parse(response2.getContentText());
    status = obj.status;
    download_url = obj.download_url;
  }
  if (!download_url) {
    throw new Error("download_url has no value.");
  }

  // 3rd request
  var options3 = {
    "headers": {
      "Authorization": "Bearer " + accessToken,
      "Accept": "application/octet-stream",
    },
  };
  var response3 = UrlFetchApp.fetch(download_url, options3);
  var blob = response3.getBlob().setContentType("application/x-gzip");
  var csv = Utilities.ungzip(blob);
  var ar = Utilities.parseCsv(csv.getDataAsString());
  // Create a new Spreadsheet and put the CSV data to the 1st tab.
  var ss = SpreadsheetApp.create("sample Spreadsheet");
  ss.getSheets()[0].getRange(1, 1, ar.length, ar[0].length).setValues(ar);
}
- I'm not sure whether the downloaded gzip file has the correct mimeType. So I added the mimeType like var blob = response3.getBlob().setContentType("application/x-gzip").
Note:
- When this script is run, the flow shown in your question is done. But I cannot test this API because I have no account. So, when an error occurs, please check each value and your access token again, and please provide the error message. By this, I would like to confirm it.
- I thought that the value of download_url returned from the 1st request might be the same as the value of download_url returned from the 2nd request. But I cannot test this API because I have no account. So, please check it, and when my understanding is correct, you can modify the above script.
- This sample script creates a new Spreadsheet. But if you want to put the CSV data into an existing Spreadsheet, please modify the above script, for example as in the sketch below.
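As an example, a minimal sketch for putting the parsed CSV data into an existing Spreadsheet might be as follows. The Spreadsheet ID and sheet name are placeholders you would need to set, and ar is the 2-dimensional array from Utilities.parseCsv in the above script.
function putCsvToExistingSpreadsheet(ar) {
  var spreadsheetId = "###your Spreadsheet ID###"; // Placeholder: please set your Spreadsheet ID.
  var sheetName = "Sheet1"; // Placeholder: please set the destination sheet name.
  var sheet = SpreadsheetApp.openById(spreadsheetId).getSheetByName(sheetName);
  // Put the 2-dimensional array from Utilities.parseCsv into the sheet, starting at cell A1.
  sheet.getRange(1, 1, ar.length, ar[0].length).setValues(ar);
}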