4

I am new to Snowflake. I have created an external stage pointing at my S3 bucket URL. My task is to copy a JSON file from S3 into a Snowflake table, but when I run the COPY command I get an error.

My commands are as follows:

create or replace file format investor_fileformat
  type = 'json';

create or replace stage investor_stage
  file_format = investor_fileformat
  url = 's3://emb-ingest-pit/extract/active/Test120220110/';

create table temp (id variant);

copy into temp from @investor_stage/Investor_20220114.json;

Error: Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)].

When I searched on the internet, it said I need an "S3 bucket integration". I haven't created one; is this needed?

Thanks, Xi

Xi12

3 Answers

3

You don't appear to have set up secure access to S3. You'll probably want to read the documentation.

NickW
2

The S3 bucket integration is done with a storage integration in Snowflake. A storage integration is used to authenticate and authorise access to S3, and the stage is then created on top of the storage integration to access the S3 locations you want.

Read how to create a storage integration and access external storage here: https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html
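For orientation, here is a minimal sketch of what that setup looks like, using the bucket from the question and a hypothetical IAM role ARN (the AWS-side role and trust policy are covered in the doc above):

create or replace storage integration investor_s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::123456789012:role/snowflake-s3-role' -- hypothetical role
  storage_allowed_locations = ('s3://emb-ingest-pit/extract/active/');

-- shows the IAM user and external ID to put in the role's trust policy
desc integration investor_s3_int;

-- recreate the stage on top of the integration instead of the bare URL
create or replace stage investor_stage
  storage_integration = investor_s3_int
  url = 's3://emb-ingest-pit/extract/active/Test120220110/'
  file_format = investor_fileformat;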

Do follow the above doc (it is quite exhaustive) and let me know if you face any challenges.

Praneeth
  • Hey @NickW, this answer is an addition to yours; it explains storage integrations and stages in simple terms, and the doc provided is what I've used in the past. – Praneeth Jan 15 '22 at 11:50
0

I faced the same issue while copying data from S3 to a Snowflake table. After a lot of troubleshooting, I finally sorted it out. If you are using a personal S3 bucket for learning or training purposes, this will be helpful.

Step 1:

First, go to S3 bucket >> Permissions >> Block public access >> uncheck "Block all public access".

Try running the COPY again. If your issue isn't resolved with step 1, follow step 2 along with step 1.

Step 2:

Go to S3 bucket >> Permissions >> Bucket policy >> Edit >> enter the script below and change the resource to your bucket name.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
        }
    ]
}

Now scroll down to Cross-origin resource sharing (CORS) >> Edit >> and enter:

[
    {
        "AllowedHeaders": [
            "Authorization",
            "Content-Length"
        ],
        "AllowedMethods": [
            "GET"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": [],
        "MaxAgeSeconds": 3000
    }
]

Try running the COPY again. If your issue isn't resolved with steps 1 & 2, follow step 3 along with steps 1 & 2.

Step 3:

Go to Permissions >> Object Ownership >> click "ACLs enabled" >> Save changes. Then go to Permissions >> Access control list >> Edit >> give List and Read access to Everyone (public access) and to the S3 log delivery group.

These steps will help you resolve the issue.
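Once the bucket is readable, the stage and COPY from the question should work as written; a quick sanity check could look like this (same object and table names as in the question):

-- list what Snowflake can see in the stage, then retry the load
list @investor_stage;

copy into temp
  from @investor_stage/Investor_20220114.json
  file_format = (format_name = investor_fileformat);

-- peek at the loaded JSON
select id from temp limit 10;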

Kranthi Kumar