I'm writing a Node.js PUT endpoint to allow users to upload large files. As I test the server with a cURL command, I'm finding that the entire file is 'uploaded' before my Node.js request handler fires:
cURL command
cat ./resource.tif \
| curl \
--progress-bar \
-X PUT \
--data-binary @- \
-H "Content-Type: application/octet-stream" \
https://server.com/path/to/uploaded/resource.tif \
| cat
From testing, I know that https://server.com/path/to/uploaded/resource.tif already exists on the server. In my Node.js code I test for this and respond with a 409 Conflict:
if (exists) {
  const msg = 'Conflict. Upload path already exists'
  res.writeHead(409, msg)
  res.write(msg)
  res.end()
  return
}
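For context, here is a stripped-down sketch of the handler showing where that check sits. The upload directory, port, and the synchronous fs.existsSync check are simplified placeholders, not my real setup:

const http = require('http')
const fs = require('fs')
const path = require('path')

const UPLOAD_ROOT = '/data/uploads' // placeholder upload directory

http.createServer((req, res) => {
  if (req.method !== 'PUT') {
    res.writeHead(405)
    res.end()
    return
  }

  // Map the request URL onto the upload directory (no sanitisation here)
  const target = path.join(UPLOAD_ROOT, req.url)

  if (fs.existsSync(target)) {
    const msg = 'Conflict. Upload path already exists'
    res.writeHead(409, msg)
    res.write(msg)
    res.end()
    return
  }

  // Otherwise stream the request body straight to disk
  const out = fs.createWriteStream(target)
  req.pipe(out)
  out.on('finish', () => {
    res.writeHead(201)
    res.end()
  })
}).listen(8080) // placeholder port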
I'm finding that the 409 response is only sent after the entire file has been uploaded, but I'm not sure whether the file is being buffered on the client side (i.e. by cURL) or on the server side.
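One way to narrow that down (a rough diagnostic sketch, not my actual handler code) is to timestamp when the handler fires versus when the first chunk of the body arrives. The handler only needs the request headers to fire, so a late first timestamp points at buffering on the client side:

// Diagnostic only: drop this at the top of the request handler
const handlerFired = Date.now()
console.log('handler fired at', new Date(handlerFired).toISOString())

let sawFirstChunk = false
req.on('data', () => {
  if (!sawFirstChunk) {
    sawFirstChunk = true
    console.log('first body chunk after', Date.now() - handlerFired, 'ms')
  }
})
req.on('end', () => {
  console.log('body fully received after', Date.now() - handlerFired, 'ms')
})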
In any case... How do I configure cURL to pass the file stream to Node.js without buffering?
Other questions/answers that I have seen - for example this one (use pipe for curl data) - use the same approach: piping the output of cat, or something similar, into curl's --data-binary argument. But this still results in the whole file being processed before I see the conflict error.
Using mbuffer, as mentioned in https://stackoverflow.com/a/48351812/3114742:
mbuffer \
-i ./myfile.tif \
-r 2M \
| curl \
--progress-bar \
--verbose \
-X PUT \
--data-binary @- \
-H "Content-Type: application/octet-stream" \
http://server.com/path/to/myfile.tif \
| cat
This clearly shows that cURL only executes the request once the entire file contents have been read into memory on the local machine.