I'm unable to access one particular external location in Databricks on AWS using the Databricks CLI. Running
databricks unity-catalog external-locations get --name <name>
returns:
Error: Authorization failed. Your token may be expired or lack the valid scope
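Since the error points at the token, it may be worth confirming which profile and host the CLI actually resolves from ~/.databrickscfg before digging into Unity Catalog permissions. A minimal sketch of that check (the hosts and tokens below are made-up placeholders; real code would read the actual file):

```python
import configparser

# Hypothetical ~/.databrickscfg contents -- in practice, read the real file
# with cp.read(os.path.expanduser("~/.databrickscfg")).
SAMPLE_CFG = """
[DEFAULT]
host = https://dbc-1111-aaaa.cloud.databricks.com
token = dapiXXXX

[other-workspace]
host = https://dbc-2222-bbbb.cloud.databricks.com
token = dapiYYYY
"""

def profile_host(cfg_text: str, profile: str = "DEFAULT") -> str:
    """Return the workspace host a given CLI profile points at."""
    cp = configparser.ConfigParser()
    cp.read_string(cfg_text)
    return cp[profile]["host"]
```

If the CLI is silently using a profile for a different workspace (or a stale token), commands against this workspace's locations would fail with an authorization error regardless of Unity Catalog grants.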
This external location is backed by an S3 bucket, and my user has ALL PRIVILEGES on the storage credential, which I verified with:
SHOW GRANTS `username` ON STORAGE CREDENTIAL `sc-name`;
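Because the CLI call reads the external-location securable itself, grants on the external location (not just on the underlying storage credential) may also matter. A sketch of that check, where the location name `el-name` is a hypothetical placeholder:

```sql
-- Check grants on the external location itself (el-name is hypothetical).
SHOW GRANTS `username` ON EXTERNAL LOCATION `el-name`;

-- If nothing is granted, a metastore admin or owner could add privileges:
-- GRANT ALL PRIVILEGES ON EXTERNAL LOCATION `el-name` TO `username`;
```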
From a Databricks notebook I'm able to query the tables in the catalog that uses this external location:
USE CATALOG `catalog-name`;
USE SCHEMA `schema-name`;
SELECT * FROM `table-name`;
Using the Databricks CLI, I'm able to access other external locations in this workspace that point to different buckets. These external locations have similar IAM roles and policies associated with them, just with different S3 resource names. Has anyone had this problem before?