
I have a stream of data that I want to send to an endpoint via a PUT request. I don't want to buffer the data locally just to compute its size, but I need to pass the content length in the headers. What are the possible ways to get the content length of a stream?

```javascript
const request = require('request');

const upload = (readStream, length) => {
  const requestOptions = {
    url: 'https://url.com',
    method: 'PUT',
    headers: {
      'Content-Type': 'application/octet-stream',
      'Content-Length': length, // <-- how do I get this without buffering the stream?
    },
    body: readStream,
  };
  request(requestOptions, function (error, response, body) {
    // handle response
  });
};
```
  • I guess there are three options: (1) figure out the stream size before streaming it, e.g. file sizes may be found using fs.stat; (2) consume the stream source twice, adding up the stream size on the first pass; (3) pass the file on to the endpoint as a binary file using Transfer-Encoding: chunked, which allows you to omit the Content-Length header. – jorgenkg Jul 30 '21 at 04:01
  • For my scenario: (1) I can't get the file size, as I am only exposed to the read stream of that file. (2) Reading the whole stream first might be bad in the case of big files, say GBs of data. (3) I tried that, but I get errors from the endpoint; it is Azure, so I need to pass the content length. I tried this solution before. – Akshay Vashishtha Jul 30 '21 at 06:25
  • Assuming that you neither have control of the server (destination) implementation nor control of the data source or any means of determining the size of the stream, the options are either to (A) bounce the source via a temporary file [in the tmp dir](https://nodejs.org/api/fs.html#fs_fspromises_mkdtemp_prefix_options) or (B) request the source stream twice and spool & discard the incoming data on the first iteration. – jorgenkg Jul 30 '21 at 10:00

0 Answers