
For small files (say < 5 MB), uploading to S3 via a presigned URL works fine. When I try to upload anything larger, I get 400 Bad Request responses from S3 with RequestTimeout as the error code.

I'm able to recreate this using curl:

# Create a 20 MB test file (mkfile is the macOS equivalent of fallocate):
mkfile -n 20m ./testfile

# Get the presigned URL from the local app:
curl -XPOST http://localhost:3000/uploads -d "filename=testfile"

# Upload to S3
curl -H "Content-Type: binary/octet-stream" -XPUT -vvv -T testfile "<Presigned URL>"

The signing code (Ruby, aws-sdk-s3) looks like this:

require 'aws-sdk-s3'

# Presign an hour-long PUT URL for the object; the uploading request's
# Content-Type must match the content_type signed here.
@presigned_url ||= Aws::S3::Presigner.new.presigned_url(
  :put_object,
  bucket: ENV['AWS_BUCKET'],
  key: key,
  acl: 'private',
  expires_in: 3600,
  content_type: 'binary/octet-stream'
)
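
For completeness, the same upload can be driven from Ruby instead of curl. A minimal sketch, assuming the presigned URL from above and the ./testfile from the repro steps:

require 'net/http'
require 'uri'

uri = URI(@presigned_url)
File.open('./testfile', 'rb') do |file|
  req = Net::HTTP::Put.new(uri)
  req['Content-Type'] = 'binary/octet-stream' # must match the signed content_type
  req.body_stream = file                      # stream the file rather than loading it
  req.content_length = file.size
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(req)
  end
  puts res.code
end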

Note that this all works fine for files < 5 MB, with failures becoming more frequent as the file size increases. Has anyone seen this issue and resolved it?

Codebeef
  • This sounds like you have connectivity issues... packet loss. Are you running these tests from inside EC2 or elsewhere on the Internet? – Michael - sqlbot Oct 31 '17 at 22:48
  • Elsewhere, though no connection issues for anything else. – Codebeef Oct 31 '17 at 23:48
  • I've tested this from several geographic locations and found the same problem. – Codebeef Nov 01 '17 at 12:12
  • It's a server problem, not related to any of the code you've shown here. It could be your actual server, or a proxy, but somewhere a max-size, max-upload-time, or max-connection-time rule is being violated. Can you reproduce it by running `echo ok>ok.txt; php -S 0.0.0.0:9999 ok.txt` on the server and `curl -v -T testfile url:9999` on the client? Also, I think this is a case for serverfault.com, not stackoverflow.com. – hanshenrik Nov 02 '17 at 12:16
  • @hanshenrik The upload goes directly to S3 from the browser / client / curl. No intermediate server is involved. – Codebeef Nov 02 '17 at 12:30
  • Contact the server admins (I guess this means going through S3 customer support?); I'm pretty sure it's their fault. – hanshenrik Nov 02 '17 at 12:54
  • Yeah, unfortunately, that's a paid service. This problem seems too broad to be unique to my situation, so I'm asking here first in case anyone else has experienced it. – Codebeef Nov 02 '17 at 12:58
  • It can't be S3. I use boto3 to get a pre-signed URL and upload very large zip files to S3 using JavaScript with no problem. Maybe you want to try using boto or the CLI to verify the problem is not with the SDK. Also, have you tried constraining the upload size with `content-length-range`? – Payman Nov 04 '17 at 18:30
  • S3 sends this error when it does not receive data from the client for 20 seconds. Does using curl with retries resolve the issue? What about a multipart upload (see the sketch after these comments)? Some people have reported similar issues that were resolved by sending headers before the payload via the `Expect: 100-continue` header: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html. Otherwise, it doesn't appear you're hitting any S3 limits. – smcstewart Nov 07 '17 at 20:38
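
If the multipart route suggested above helps, a minimal sketch of presigned multipart parts with the Ruby SDK might look like the following. The bucket, key, and part size are placeholders; it assumes AWS credentials in the environment and the 20 MB ./testfile from the repro steps:

require 'aws-sdk-s3'
require 'net/http'
require 'uri'

client = Aws::S3::Client.new
bucket = ENV['AWS_BUCKET']
key    = 'testfile'

# Start the multipart upload, then presign a URL per part.
mpu    = client.create_multipart_upload(bucket: bucket, key: key, acl: 'private')
signer = Aws::S3::Presigner.new(client: client)
parts  = []

File.open('./testfile', 'rb') do |file|
  part_number = 1
  # 10 MB parts; every part except the last must be at least 5 MB.
  while (chunk = file.read(10 * 1024 * 1024))
    url = signer.presigned_url(
      :upload_part,
      bucket: bucket, key: key,
      upload_id: mpu.upload_id,
      part_number: part_number,
      expires_in: 3600
    )
    uri = URI(url)
    res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
      req = Net::HTTP::Put.new(uri)
      req.body = chunk
      http.request(req)
    end
    # S3 returns an ETag per part; it is required to complete the upload.
    parts << { etag: res['ETag'], part_number: part_number }
    part_number += 1
  end
end

client.complete_multipart_upload(
  bucket: bucket, key: key,
  upload_id: mpu.upload_id,
  multipart_upload: { parts: parts }
)

Each part is a short, independent PUT with its own retry opportunity, so a 20-second stall on any single part is far less likely than on one 20 MB request.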

0 Answers