
I am able to create an Amazon S3 signed URL for a bucket in my account from which I can download and upload via the Amazon AWS CLI.

I have created the Amazon S3 URL as follows:

import os
from boto.s3.connection import S3Connection

key = os.environ['aws_access_key_id']
secret = os.environ['aws_secret_access_key']
c = S3Connection(key, secret, is_secure=False)
bucket = c.get_bucket('my-bucket')
bktkey = bucket.get_key('stuff.tar.gz')
seconds = 60 * 60 * 12
url2 = bktkey.generate_url(expires_in=seconds)
print '%s' % url2

When I copy and paste url2 into Chrome, stuff.tar.gz downloads as expected.

But when I use Wget like so:

wget <whatever is in url2>

I get the following error:

HTTP request sent, awaiting response... 403 Forbidden
2015-12-04 20:48:57 ERROR 403: Forbidden.

Why is Wget failing where Chrome and Firefox are successful in downloading using the signed Amazon S3 URL?

– mg03

1 Answer


I had the same issue. The fix is to use quotes:

wget -O test.tar.gz "https://bucket.s3.amazonaws.com/app1/10/10.2/test.tar.gz?Signature=hgbdhjJHGK&Expires=1454446963&AWSAccessKeyId=AAAAAAAAA"
– user3323536
  • The issue is that some of the characters in the URL have special meaning in the shell, which is why it fails in the shell but not in the browser. For example, '&' tells bash to put the preceding command in the background, so bash only parses the URL up to that point before truncating it. Adding quotes around the URL causes bash to treat it as a single string rather than as command-line syntax. – Mark Stosberg May 30 '18 at 20:36
  • Thanks. I didn't expect the solution to be as simple as adding quotes. I thought it was related to AWS protection or something similar. – pitfall Oct 23 '18 at 18:20
  • This is the second time I've come to this exact answer to find the solution. Thanks! – Joshua Pinter Sep 26 '19 at 02:09
  • Why is this not the accepted answer? – qwertynik Feb 02 '22 at 10:31
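A quick way to see the shell behavior described in the comments, using a made-up URL (the bucket name and query-string values below are placeholders, not a real signed URL):

```shell
# Hypothetical signed URL; the Signature/Expires/AWSAccessKeyId values
# are placeholders for illustration only.
url='https://bucket.s3.amazonaws.com/test.tar.gz?Signature=abc&Expires=123&AWSAccessKeyId=AAA'

# Unquoted, each '&' acts as a command separator. The shell would run
#   wget https://bucket.s3.amazonaws.com/test.tar.gz?Signature=abc   (in the background)
# and then try to execute 'Expires=123' and 'AWSAccessKeyId=AAA' as
# separate commands. S3 receives a URL missing its Expires and key
# parameters and responds 403 Forbidden.

# Quoted, the URL survives as a single argument:
echo "$url"
```

Single quotes work equally well here, and they additionally prevent the shell from expanding any `$` characters that might appear in a signature.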