I'm currently trying to upload a file to AWS S3 using node-fetch in Node.js. The first request succeeds, but on the second try the socket hangs up. The error is:

Error: FetchError: request to https://s3.ap-southeast-1.amazonaws.com/ failed, reason: socket hang up
    at ClientRequest.<anonymous> (C:\Users\project\node_modules\node-fetch\lib\index.js:1461:11)
    at ClientRequest.emit (events.js:323:22)
    at TLSSocket.socketOnEnd (_http_client.js:460:9)
    at TLSSocket.emit (events.js:323:22)
    at endReadableNT (_stream_readable.js:1204:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)

The code:

  const fs = require("fs");
  const fetch = require("node-fetch");
  const FormData = require("form-data");

  const form = new FormData();
  const readFile = await fs.promises.readFile(path);
  for (const [key, value] of Object.entries(credentials.params)) {
    form.append(key, value);
  }
  form.append("file", readFile);

  const upload = await fetch(credentials.endpoint_url, {
    method: "POST",
    body: form
  });

I'm guessing the problem is related to garbage collection, file descriptors, or a connection that isn't closed properly, but I can't find a workaround. The function runs inside an Express route handler: each time the endpoint is hit, it calls this function, and as mentioned above, the first request succeeds but the second does not.

  • I don't think it's related to file descriptors or GC; this is a very common use case, and readFile is non-blocking because Node uses threads to read the file in chunks, so it's not the file being blocked. Are you sure S3 doesn't have some sort of throttling in your case? – Sebastián Espinosa Apr 06 '21 at 07:47
  • @SebastiánEspinosa yup. Because I've tried the same thing using the same code in the browser and it works perfectly fine. – Dominique Altrez Apr 06 '21 at 08:15

0 Answers