
The error I'm getting:

UnhandledPromiseRejectionWarning: FetchError: invalid json response body at {url}
reason: Unexpected token < in JSON at position 0

My code:

const fetch = require('node-fetch');

const url = 'Reeeealy long url here';

fetch(url)
  .then(res => res.json())
  .then(console.log);

Thing is, if the URL is longer than ~8k characters, the API returns

400 Bad Request
Request Header Or Cookie Too Large
nginx

Obviously, I don't control that API.

What can I do to prevent this?

URL structure:

1) domain

2) api version

3) endpoint

4) request stuff (longest part)

5) id at the end

It looks like this: https://example.com/v1/endpoint/query?query=long_part_here&ids=2145132,532532,535

Pleklo

1 Answer


It sounds like a poorly designed API if the 'long_part' is expected to be very long. Instead of a GET request it should use POST, so that the long set of data can be sent in the request body. Can you check whether the API allows a POST version of the endpoint?
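If it does, a POST version might look like the sketch below. The endpoint URL, the body field names (`query`, `ids`), and the assumption that it accepts JSON are all guesses based on the URL structure in the question, not documented API behavior; check the API docs for the real contract:

```javascript
// node-fetch on older Node versions; Node 18+ ships a global fetch
const fetch = globalThis.fetch ?? require('node-fetch');

// Hypothetical body shape -- field names are assumptions taken from the URL parts
const body = JSON.stringify({
  query: 'long_part_here',
  ids: [2145132, 532532, 535],
});

fetch('https://example.com/v1/endpoint/query', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body,
})
  .then(res => res.json())
  .then(console.log)
  .catch(console.error);
```

Since the long data now travels in the request body instead of the URL, it is no longer subject to nginx's URL/header length limit.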

If no POST is available, and you don't control the API, you don't have many options. The only thing I can think of is to break your request into multiple separate endpoint calls (maybe one per id), if that is feasible and would result in a shorter URL per request.

Multiple calls

If you're able to do multiple smaller requests, the code might look like this:

const urls = ["firstUrl", "secondUrl", "nthUrl"];
let combined = {};

// collect one promise per request so we can wait for all of them
const requests = urls.map(url =>
  fetch(url)
    .then(res => res.json())
    .then(json => combined = {...combined, ...json})
);

// only log once every request has finished
Promise.all(requests).then(() => console.log(combined));

This assumes that it's reasonable to merge the results all into one object. If they should be kept distinct, you could change the last then like this:

.then(json => combined = {...combined, [`url${count}`]: json})

where count is an integer that you increment each time and combined would look like

{url1: {/*json from url1*/}, url2: {/*json from url2*/}, ...}
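If the query part is fixed and only the id list grows, one way to decide how many URLs to split into is to pack ids greedily into URLs until each one approaches the ~8k limit. This is a sketch: the base URL is the example from the question and the 8000-character cap is a placeholder for whatever limit the server actually enforces:

```javascript
// Placeholder base URL and cap, taken from the question
const base = 'https://example.com/v1/endpoint/query?query=long_part_here&ids=';
const MAX_URL_LENGTH = 8000;

function buildUrls(ids) {
  const urls = [];
  let batch = [];
  for (const id of ids) {
    const candidate = base + [...batch, id].join(',');
    if (batch.length > 0 && candidate.length > MAX_URL_LENGTH) {
      // adding this id would overflow the cap: emit the batch, start a new one
      urls.push(base + batch.join(','));
      batch = [];
    }
    batch.push(id);
  }
  if (batch.length > 0) urls.push(base + batch.join(','));
  return urls;
}

console.log(buildUrls([2145132, 532532, 535])); // three short ids fit in one URL
```

Each URL in the returned array can then be fetched and merged as shown in the Multiple calls section above.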

Error Handling

To handle the error more gracefully, you should check the response status before assuming it returned JSON. You got a JSON parse error because the body was not JSON: it was HTML, so parsing failed at the leading <. You could do something like this:

fetch(url)
  .then(res => {
    // node-fetch exposes the HTTP status as res.status (and res.ok for 2xx)
    if (res.ok) return res.json();
    return Promise.reject(`Bad call: ${res.status}`);
  })
  .then(console.log)
  .catch(console.error);
Always Learning
  • Gonna check if I can do a `POST` request. Funniest part is that this API is made by a huge company, but it's the worst API I've had to deal with. Do you have any recommendations on how to split the request into `2` or `n` requests and combine them into one JSON? – Pleklo Nov 10 '19 at 09:21
  • I've edited to show how you might do multiple smaller calls – Always Learning Nov 10 '19 at 09:31
  • Oh, I meant more how do I split the original URL into `n` URLs if all I know is that it can't be longer than ~8k characters. – Pleklo Nov 10 '19 at 09:34
  • to answer that I'd have to know details about the endpoint. I saw you had multiple ids at the end of the URL. Would it be shorter if you just had one `id` per URL and the data was limited to only the data for that `id`? – Always Learning Nov 10 '19 at 09:39
  • no, the query is always the same, but sometimes you want to get results with the same query but for multiple ids – Pleklo Nov 10 '19 at 09:40
  • so it's either 1 id or `n` ids, but the query always stays the same – Pleklo Nov 10 '19 at 09:41
  • if there is no way to shorten the query data then your only remaining choice is to contact the company and tell them their endpoint can't handle long queries and request that they implement a `POST` version that could do that (or any other fix they recommend) – Always Learning Nov 10 '19 at 09:42
  • if the API is public and documented (ie, something you are free to talk about the details of here on `stackoverflow`) then you could ask a new question specifically about how to call this endpoint with large amounts of data. Maybe others would know of alternate ways to do what you want. – Always Learning Nov 10 '19 at 09:45