I'm unable to access one particular external location in Databricks on AWS using the Databricks CLI:

databricks unity-catalog external-locations get --name <name>

returns:

Error: Authorization failed. Your token may be expired or lack the valid scope

This external location points to an S3 bucket, and my user has ALL PRIVILEGES on the storage credential, verified with:

SHOW GRANTS `username` ON STORAGE CREDENTIAL `sc-name`;

From a Databricks notebook, I'm able to query the tables in the catalog that uses this external location:

USE CATALOG `catalog-name`;
USE SCHEMA schema-name;
SELECT * FROM table-name;

Using the Databricks CLI, I'm able to access other external locations in this workspace that point to different buckets. Those external locations have a similar IAM role and policy associated with them, just with different S3 resource names. Has anyone had this problem before?
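For reference, this is roughly the comparison I'm making between the working and failing locations (the location names are placeholders for mine; same profile and token throughout):

```shell
# List every external location the CLI can see with this token.
# The broken one should appear here if the token has the right scope.
databricks unity-catalog external-locations list

# The failing call, next to an equivalent call against a location
# that works, to rule out a general auth problem:
databricks unity-catalog external-locations get --name broken-location
databricks unity-catalog external-locations get --name working-location
```

Only the first `get` fails with the authorization error; `list` and the second `get` succeed.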

  • Since it works for other external locations, then the issue is probably related to permissions. Check the permissions on the External Location in the Unity Catalog interface. – John Rotenstein May 10 '23 at 03:27
  • I don't see it listed in the Unity Catalog interface. My user is an admin user though – gary69 May 10 '23 at 13:47

0 Answers