
I've created an S3-linked stage on Snowflake called csv_stage with my AWS credentials, and the creation was successful.

Now I'm trying to query the stage like below

select t.$1, t.$2 from @sandbox_ra.public.csv_stage/my_file.csv t

However the error I'm getting is

Failure using stage area. Cause: [The AWS Access Key Id you provided is not valid.]

Any idea why? Do I have to pass something in the query itself?

Thanks for your help!

Ultimately, let's say my S3 location has 3 different CSV files. I would like to load each of them individually into a different Snowflake table. What's the best way to go about doing this?

Rex Aex
  • Have you provided access to the files to the Snowflake user? – Digvijay S Sep 25 '20 at 08:18
  • Try this solution: https://aws.amazon.com/premiumsupport/knowledge-center/access-key-does-not-exist/ – Digvijay S Sep 25 '20 at 08:19

3 Answers


Regarding the last part of your question: you can load multiple files with a single COPY INTO command by listing the file names or by using a regex pattern. But since your 3 files go into 3 different tables, you also need three separate COPY INTO commands.
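A minimal sketch of what that could look like (table names, file names, and the file-format options are placeholders, not from your setup):

```sql
-- One COPY INTO per target table; FILES pins each command to a specific file.
COPY INTO my_table_a
  FROM @sandbox_ra.public.csv_stage
  FILES = ('file_a.csv')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

COPY INTO my_table_b
  FROM @sandbox_ra.public.csv_stage
  FILES = ('file_b.csv')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- PATTERN is the regex alternative when several files feed the same table.
COPY INTO my_table_c
  FROM @sandbox_ra.public.csv_stage
  PATTERN = '.*file_c.*[.]csv'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```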

Regarding querying your stage, you can find some more hints in these questions:

  1. Missing List-permissions on AWS - Snowflake - Failure using stage area. Cause: [The AWS Access Key Id you provided is not valid.]
  2. https://community.snowflake.com/s/question/0D50Z00008EKjkpSAD/failure-using-stage-area-cause-access-denied-status-code-403-error-code-accessdeniedhow-to-resolve-this-error
  3. https://aws.amazon.com/de/premiumsupport/knowledge-center/access-key-does-not-exist/
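For the querying part itself, once the credentials are fixed, attaching a named file format to the staged-file query usually makes the columns parse cleanly. A sketch, assuming a file format called my_csv_format that you would create first:

```sql
-- Hypothetical named file format; adjust the CSV options to your files.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  SKIP_HEADER = 1
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';

-- $1, $2 address the first and second column of the staged file.
SELECT t.$1, t.$2
FROM @sandbox_ra.public.csv_stage/my_file.csv
  (FILE_FORMAT => 'my_csv_format') t;
```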
Marcel

I found out the AWS credentials I provided were not right. After fixing that, the query worked.

Rex Aex

This approach works to import data into a Snowflake table from a public S3 bucket:

COPY INTO SNOW_SCHEMA.table_name FROM 's3://test-public/new/solution/file.csv'
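In practice you usually also want to tell COPY INTO how to parse the CSV. A sketch building on the command above (the header and quoting options are assumptions about the file, not something stated here):

```sql
-- Same public-bucket load, with an inline file format for typical CSVs.
COPY INTO SNOW_SCHEMA.table_name
  FROM 's3://test-public/new/solution/file.csv'
  FILE_FORMAT = (TYPE = CSV
                 SKIP_HEADER = 1
                 FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```

For a private bucket you would go through a stage (as in the question) or supply CREDENTIALS in the command instead.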
Dror