
I created a table with a location such as:

wasb://<container>@<storageaccount>.blob.core.windows.net/foldername 

We have updated access to storage accounts to use abfss 

I am trying to execute the following command: 

alter table mydatabase.mytable
set location 'abfss://<container>@<storageaccount>.dfs.core.windows.net/foldername'

I am getting the error:

Failure to initialize configuration for storage account <storageaccount>.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key

On the cluster I have changed:

spark.hadoop.fs.azure.account.key.<storageaccount>.blob.core.windows.net {{secrets/<proyect>/<storageaccount>}}

to this:

spark.hadoop.fs.azure.account.key.<storageaccount>.dfs.core.windows.net {{secrets/<proyect>/<storageaccount>}}
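
For reference, the equivalent session-scope setting in a notebook would be something like this (a sketch, using the same secret scope and account placeholders as above):

# Read the key from the same secret scope referenced in the cluster config
key = dbutils.secrets.get(scope="<proyect>", key="<storageaccount>")
# Register it for the dfs (abfss) endpoint in this Spark session
spark.conf.set("fs.azure.account.key.<storageaccount>.dfs.core.windows.net", key)
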
jalazbe

1 Answer


The error you are facing is related to the storage account key configuration. Set the key as below (the second argument should be the actual account key for the ADLS account):

spark.conf.set("fs.azure.account.key.<adls_account_name>.dfs.core.windows.net", "adls_account_key")

After setting that, try altering the table.
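
For example, from a notebook (a sketch using the placeholders from the question; DESCRIBE DETAIL is available because the table is Delta):

# Repoint the table at the abfss path
spark.sql("""
    ALTER TABLE mydatabase.mytable
    SET LOCATION 'abfss://<container>@<storageaccount>.dfs.core.windows.net/foldername'
""")
# Confirm where the table now points
spark.sql("DESCRIBE DETAIL mydatabase.mytable").select("location").show(truncate=False)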

Alternatively, you can follow the approach below using dbutils.

Below is the table I created.

CREATE TABLE user
(
    id INT,
    name STRING,
    age INT
)
USING DELTA
LOCATION 'wasbs://container@<storage_acc_name>.blob.core.windows.net/user_data/';


Next, move all the files to the ADLS account.

dbutils.fs.mv("wasbs://<container>@<storage_acc_name>.blob.core.windows.net/user_data/","abfss://container@<adls_storage_acc_name>.dfs.core.windows.net/user_data/",recurse=True)
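
As a sanity check after the move, you can list the destination path (a sketch; the path matches the mv call above):

# List the files that landed at the ADLS destination
files = dbutils.fs.ls("abfss://container@<adls_storage_acc_name>.dfs.core.windows.net/user_data/")
display(files)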

Now, you can see the records have moved to the ADLS account.

select  *  from  delta.`abfss://data@jadls2.dfs.core.windows.net/user_data/`;


JayashankarGS
  • In my case the directory has not changed; the only change is the access. Would the dbutils move actually "move" the data? If so, I can't do this because it would be a really large operation. – jalazbe Aug 17 '23 at 06:28
  • When you do an alter on ADLS storage, the destination location should be in Delta format, else you will get an error saying the location is not in Delta format. Have you tried the alter command after setting the key? – JayashankarGS Aug 17 '23 at 06:34
  • Yes, dbutils moves all files to the destination. – JayashankarGS Aug 17 '23 at 06:34
  • It is in Delta format. I set the key in the Spark config so it applies when the cluster is created. I have tried the alter statement after that and it is not working. – jalazbe Aug 17 '23 at 06:54
  • OK, what is the error you are getting after setting the key and running the alter command? – JayashankarGS Aug 17 '23 at 06:58
  • The one I included. I am talking to a colleague and there might be an issue with the firewall configuration. – jalazbe Aug 17 '23 at 07:54