
I currently have a PowerShell script that fetches orders through a REST API call. The script (1) uses `Invoke-RestMethod` for the API call, then (2) uses the `Export-Csv` cmdlet to save the results to a CSV file, which works fine:

```powershell
$uri = "https://api.SomeWebisite.com/api/orders?limit=1000"
$headers = @{
    'Content-Type'  = 'application/json'
    'Authorization' = 'Bearer SomeTokenKey'
    'Accept'        = 'application/json'
}

Invoke-RestMethod -Uri $uri -Method GET -Headers $headers |
    Select-Object -ExpandProperty data |
    Export-Csv C:\extracts\orders.csv -NoTypeInformation -Force
```

(I've dropped the `-Body $body` argument here, since `$body` was never defined and GET requests don't normally carry one, along with the redundant `select *`.)

The problem is that the provider enforces a 1,000-record limit per call, which doesn't work for us, as we need all orders in the dataset.

What would be the most efficient/appropriate way to retrieve the full recordset and have all orders reside in a single CSV file?
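For what it's worth, if the API happens to support offset-based paging (an assumption; the provider's documentation would confirm the actual mechanism and parameter names), the retrieval loop might be sketched like this, with the fetch step factored into a script block so it can be swapped for whatever paging scheme the API really uses:

```powershell
# Sketch of an offset-based paging loop. The `offset` query parameter and
# the shape of the response (`.data`) are assumptions about this API.
function Get-AllPages {
    param(
        [scriptblock]$Fetch,   # invoked as: & $Fetch -Offset <n>; returns one page of records
        [int]$Limit = 1000     # the provider's per-call record cap
    )
    $all = [System.Collections.Generic.List[object]]::new()
    $offset = 0
    do {
        $page = @(& $Fetch -Offset $offset)
        $all.AddRange($page)
        $offset += $Limit
    } while ($page.Count -eq $Limit)   # a short (or empty) page means we've reached the end
    $all
}

# Hypothetical usage against the orders endpoint:
# $orders = Get-AllPages -Fetch {
#     param($Offset)
#     (Invoke-RestMethod -Uri "https://api.SomeWebisite.com/api/orders?limit=1000&offset=$Offset" `
#         -Method GET -Headers $headers).data
# }
# $orders | Export-Csv C:\extracts\orders.csv -NoTypeInformation -Force
```

Collecting everything into one list before a single `Export-Csv` call keeps all orders in one file with one header row.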

Asked by Depth of Field
  • Does the API support some kind of paging mechanism? For example, do they return the URL for the next lot of records in the response header? If so, you'd just need to loop around calling for each page, adding the data to a big collection, then dumping to CSV once you have it all. – boxdog Nov 19 '21 at 16:46
  • Depends entirely on the pagination semantics of that particular API. Some REST APIs are [standardized](https://datatracker.ietf.org/doc/html/rfc5988#page-6), while others might support offset-based pages (eg. the `$n`th request will have to be to `/api/orders?limit=1000&offset=$(($n - 1) * 1000)`), page numbers (the `$n`th request will have to be to `/api/orders?limit=1000&page=$n`), or perhaps cursor-based pagination (response includes an explicit continuation token, or you'll have to use the id or timestamp of the last record). You'll have to consult the documentation (or just try stuff!) – Mathias R. Jessen Nov 19 '21 at 16:52
  • PowerShell 7's `Invoke-RestMethod -FollowRelLink` will automatically fetch the whole set for you if the API supports relational links btw – Mathias R. Jessen Nov 19 '21 at 16:53
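If the API does return RFC 5988 `Link: <...>; rel="next"` headers as the last comment suggests (an assumption worth checking against the provider's responses), the whole loop collapses to one pipeline in PowerShell 7+:

```powershell
# PowerShell 7+ only: -FollowRelLink walks `Link: rel="next"` response headers
# automatically, emitting each page's parsed body as a separate object.
# The `.data` property is an assumption about this API's response shape.
Invoke-RestMethod -Uri $uri -Method GET -Headers $headers -FollowRelLink |
    ForEach-Object { $_.data } |
    Export-Csv C:\extracts\orders.csv -NoTypeInformation -Force
```

There is also a companion `-MaximumFollowRelLink` parameter to cap the number of pages fetched, which is useful while testing.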

0 Answers