
I'm receiving the error "invalid property 'auto_refresh' for 'different storage type from cloud provider'" when creating an external table, after following the guidelines in the Snowflake doc Refreshing External Tables Automatically for Azure Blob Storage. Has anyone encountered this before? Any ideas on how to solve it? Thanks in advance

  • Please can you update your question with the create statements for the various Snowflake objects you've created - mask any confidential information – NickW May 10 '21 at 20:06
  • It would also be helpful to clarify whether your Snowflake account exists on Azure, as well. – Mike Walton May 12 '21 at 01:16

2 Answers


I think the problem is that a notification channel is not set.

Run SHOW EXTERNAL TABLES and check the notification_channel column.
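As a sketch of that check (the table name is a placeholder):

```sql
-- List external tables in the current schema; the notification_channel
-- column shows the queue/topic Snowflake expects cloud events on.
-- If it is empty, auto-refresh has nothing to listen to.
SHOW EXTERNAL TABLES;

-- Or inspect a single table (my_ext_table is a placeholder name):
SHOW EXTERNAL TABLES LIKE 'my_ext_table';
```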

orellabac

Without knowing which cloud is being used, I'll just add what I ran into with a GCP-backed Snowflake instance in the hope that it helps someone down the line. We allow AWS, Azure, and GCP. Our standard external table creation properties look like the following:

...
partition by (key1,key2,key3)
with location = @some_stage/key1=x/key2=y/key3=z
refresh_on_create = true
auto_refresh = true
file_format = (type = 'PARQUET')
pattern = '.*.parquet.gz';

We attempted to create our first GCP table with these properties, but received the error mentioned in the question. Other instances (AWS / Azure) did not encounter this issue.

My resolution was simply to set auto_refresh = false in our default options. This was after sifting through Snowflake's official documentation on auto_refresh for external table creation. Since we handle table refreshes differently and aren't hooking up integration objects or creating triggers, this was an appropriate change.
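A sketch of the adjusted statement, reusing the placeholder stage, partition keys, and pattern from above (the table name is also a placeholder):

```sql
create external table some_table
partition by (key1, key2, key3)
with location = @some_stage/key1=x/key2=y/key3=z
refresh_on_create = true   -- still populates metadata once at creation
auto_refresh = false       -- disabled: no notification integration on this cloud
file_format = (type = 'PARQUET')
pattern = '.*.parquet.gz';
```

With auto_refresh off, the metadata can later be refreshed on your own schedule with ALTER EXTERNAL TABLE some_table REFRESH;.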

persinac