
I've been trying for the past couple of hours to set up a transfer from S3 to my Google Storage bucket.

The error I keep getting when creating the transfer is: "Invalid access key. Make sure the access key for your S3 bucket is correct, or set the bucket permissions to Grant Everyone."

Both the access key and the secret are correct, given that they are currently in use in production for S3 full access.

A couple of things to note:

  1. CORS is enabled on the S3 bucket
  2. The bucket policy only allows authenticated AWS users to list/view its contents
  3. S3 requires signed URLs for access

Bucket Policy:

{
    "Version": "2008-10-17",
    "Id": "Policy234234234",
    "Statement": [
        {
            "Sid": "Stmt234234",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": [
                "s3:AbortMultipartUpload",
                "s3:GetObjectAcl",
                "s3:RestoreObject",
                "s3:GetObjectVersion",
                "s3:DeleteObject",
                "s3:DeleteObjectVersion",
                "s3:PutObjectVersionAcl",
                "s3:PutObjectAcl",
                "s3:GetObject",
                "s3:PutObject",
                "s3:GetObjectVersionAcl"
            ],
            "Resource": "arn:aws:s3:::mybucket/*"
        },
        {
            "Sid": "2",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity xyzmatey"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::mybucket/*"
        },
        {
            "Sid": "3",
            "Effect": "Allow",
            "Principal": {
                "AWS": "*"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::mybucket"
        }
    ]
}

CORS Policy:

<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>http://www.mywebsite.com</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedMethod>DELETE</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedHeader>AUTHORIZATION</AllowedHeader>
    </CORSRule>
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>HEAD</AllowedMethod>
        <AllowedHeader>AUTHORIZATION</AllowedHeader>
    </CORSRule>
</CORSConfiguration>

Any idea where I have gone wrong?

EDIT: I've set up the gsutil tool on a Google Compute Engine instance and ran a copy of the exact same bucket with the same AWS keys. It worked like a charm.

Mysteryos
  • Your bucket policy doesn't include "s3:ListBucket". I am guessing that the transfer service might need that in order to get a list of objects to transfer. Try adding it to the list? Of course, that wouldn't explain how gsutil manages to copy the bucket, so that may be wrong. – Brandon Yarbrough Feb 05 '16 at 01:12
  • Hey Brandon, I added the permission you mentioned above. Same result: invalid key. Will see if I can get hold of Google's support staff for this one. Thanks. – Mysteryos Feb 05 '16 at 06:30

5 Answers


I'm one of the devs on Transfer Service.

You'll need to add "s3:GetBucketLocation" to your permissions.

It would be preferable if the error you received referred specifically to your ACLs rather than to an invalid key, however. I'll look into that.

EDIT: Adding more info to this post. There is documentation which lists this requirement: https://cloud.google.com/storage/transfer/

Here's a quote from the section on "Configuring Access":

"If your source data is an Amazon S3 bucket, then set up an AWS Identity and Access Management (IAM) user so that you give the user the ability to list the Amazon S3 bucket, get the location of the bucket, and read the objects in the bucket." [Emphasis mine.]
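Concretely, the three capabilities in that quote map to three S3 actions. As a hedged sketch, a minimal IAM user policy covering them might look like the following (the bucket name is illustrative; attach it to the IAM user whose keys you give the Transfer Service):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::mybucket"
        },
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::mybucket/*"
        }
    ]
}
```

Note that the bucket-level actions (ListBucket, GetBucketLocation) apply to the bucket ARN itself, while GetObject applies to the objects under it (the `/*` resource), mirroring the split in the bucket policy in the question.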

EDIT2: Much of the information provided in this answer could be useful for others, so it will remain here, but John's answer actually got to the bottom of OP's issue.

Jacob Criner
  • Hey Jacob, I added `s3:GetBucketLocation` to the policy. Same error. Then I tried the most permissive policy on that bucket and its objects: `Principal:"*"`, `Action:"*"`. The error stayed the same. The IAM user has full access to S3, and the AWS Policy Simulator flagged green on all permissions for that bucket, for both anonymous and authenticated AWS users. Any insight? – Mysteryos Feb 06 '16 at 07:15
  • Further tests: the transfer sets up correctly (with the same access keys) when the bucket location is US or EU. It says invalid key when the bucket location is Singapore, which is my region. The same permissive policies and CORS rules were used in all tests. – Mysteryos Feb 06 '16 at 07:26
  • Hey Jacob, what's the best option for transferring from a specific S3 directory where I don't have bucket-level permissions to list, get the location of, and read all objects of the bucket? – Julian V. Modesto Jun 14 '16 at 20:55
  • @JulianV.Modesto My understanding is that you don't have bucket-level permissions to list, but DO have permissions to read all objects within a bucket. Is that correct? Assuming that is the case, the Transfer Service won't work for you. Listing permissions are required. You could potentially try our HTTP transfer service, which requires a provided list of URLs, but that would require listing the contents of the S3 bucket to obtain anyway. – Jacob Criner Jun 16 '16 at 16:54
  • @JacobCriner well I only have permissions to read and list objects within a user-specific S3 directory, so I feel like my options are super limited – Julian V. Modesto Jun 16 '16 at 17:30
  • @JulianV.Modesto Unfortunately, user-specific directories are not currently supported. Our service assumes you are the sole controller (or have admin privileges) over the bucket you're trying to transfer data from. If you only have a user-specific S3 directory, however, I'm willing to bet you don't have all that much data. We typically recommend our service for people transferring at least 1TB, preferably 10TB. gsutil works very well for transfers smaller than that, and can sync between an S3 source and GCS. See here for details: https://cloud.google.com/storage/docs/interoperability – Jacob Criner Jun 17 '16 at 23:04
  • @JulianV.Modesto TL;DR: Try gsutil for your transfer. The link above includes details you'll need. – Jacob Criner Jun 17 '16 at 23:12
  • @JacobCriner really appreciate your thoroughness! Getting Access Denied for gsutil ls-ing and rsync-ing on the user-specific S3 directory. Best way to submit a feature request? The data's about 20TB from a user-specific directory of an external partner's S3 bucket, and then transfer 1TB daily from them, so it seems like the Transfer Service would've been a great fit – Julian V. Modesto Jun 20 '16 at 16:12
  • @JulianV.Modesto The fact that user-specific directories are a fairly standard usage of S3 inclines me to think we need to improve our access-control model. I've filed a feature request internally to look into this. As for gsutil: you can make a feature request here: https://github.com/GoogleCloudPlatform/gsutil/issues. – Jacob Criner Jun 24 '16 at 04:21
  • @JacobCriner thanks!! Appreciate the internal feature request – Julian V. Modesto Jun 24 '16 at 15:24

I am an engineer on the Transfer service. The reason you encountered this problem is that the AWS S3 region ap-southeast-1 (Singapore) is not yet supported by the Transfer service, because GCP does not have a networking arrangement with AWS S3 in that region. We can consider supporting that region now, but your transfer would be much slower than from other regions.

On our end, we are making a fix to display a clearer error message.

John

You can also get the 'Invalid access key' error if you try to transfer a subdirectory rather than a root S3 bucket. For example, I tried to transfer s3://my-bucket/my-subdirectory and it kept failing with the invalid access key error, despite my giving Google read permissions on the entire S3 bucket. It turns out the Google transfer service doesn't support transferring subdirectories of an S3 bucket; you must specify the root as the source for the transfer: s3://my-bucket.

Josh
  • Josh, does gsutil work with s3 subdirectories? If not, any other workaround? – user3688176 May 16 '16 at 19:24
  • I'm not sure about gsutil, but with the transfer service it's possible to set a parameter called 'includePrefixes' in the transfer spec. This can be set to a list of prefixes (which can be subdirectories), and prevents transfer service from copying other subdirectories. You still have to specify the root bucket as the source though! – Josh May 16 '16 at 19:50
  • This is still the case - this is terrible UX – the1dv Jun 07 '18 at 03:39
  • This is so bad, even in 2020. I have some files on S3 in a subdirectory. I don't have access to the root directory, and even with prefixes it doesn't work, since it needs access to the root. – Behroz Sikander Jun 02 '20 at 08:38
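As the comments note, the workaround is to keep the root bucket as the source and narrow the copy with `includePrefixes`. As a hedged sketch, a transferJobs.create request body using it might look like this (bucket names and prefix are illustrative):

```json
{
    "transferSpec": {
        "awsS3DataSource": {
            "bucketName": "my-bucket"
        },
        "objectConditions": {
            "includePrefixes": ["my-subdirectory/"]
        },
        "gcsDataSink": {
            "bucketName": "my-gcs-bucket"
        }
    }
}
```

The source stays the root bucket, satisfying the service's requirement, while the object conditions restrict which keys are actually copied.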

Maybe this can help.

First, specify s3_host in your boto config file, i.e. the S3 endpoint containing the region (there is no need to specify it if the region is us-east-1, which is the default). For example:

vi ~/.boto

s3_host = s3-us-west-1.amazonaws.com

That is it. Now you can proceed with any one of these commands:

gsutil -m cp -r s3://bucket-name/folder-name gs://Bucket/

gsutil -m cp -r s3://bucket-name/folder-name/specific-file-name gs://Bucket/

gsutil -m cp -r s3://bucket-name/folder-name/ gs://Bucket/

gsutil -m cp -r s3://bucket-name/folder-name/file-name-prefix gs://Bucket/

You can also try rsync.

https://cloud.google.com/storage/docs/gsutil/commands/rsync
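For completeness, a hedged sketch of the relevant ~/.boto entries (placeholder values; gsutil also needs your AWS credentials in this file to talk to S3 at all, and the exact section s3_host belongs in may vary by gsutil version):

```ini
[Credentials]
aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY
s3_host = s3-us-west-1.amazonaws.com
```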


I encountered the same problem a couple of minutes ago, and I was able to solve it easily by using an admin access key and secret key.

It worked for me. Just FYI, my S3 bucket was in the Northern Virginia (us-east-1) region.

Omisha gupta