When I try to access an ADLS directory from PySpark in Apache Spark, I get the following error:
ValueError: root_directory must be an absolute path. Got abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/ instead.
Traceback (most recent call last):
File "/home/trusted-service-user/cluster-env/env/lib/python3.6/site-packages/great_expectations/core/usage_statistics/usage_statistics.py", line 262, in usage_statistics_wrapped_method
result = func(*args, **kwargs)
The code that produces the above error when I try to access the directory is as follows:
from great_expectations.data_context import BaseDataContext
from great_expectations.data_context.types.base import DataContextConfig, FilesystemStoreBackendDefaults

data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(
        root_directory='abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'
    ),
)
context = BaseDataContext(project_config=data_context_config)
When I change the code to
data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(
        root_directory='/abfss://root@adlspretbiukadlsdev.dfs.core.windows.net/RAW/LANDING/'
    ),
)
context = BaseDataContext(project_config=data_context_config)
I get the following error message:
PermissionError: [Errno 13] Permission denied: '/abfss:'
Traceback (most recent call last):
When I enter the following code
data_context_config = DataContextConfig(
    datasources={"my_spark_datasource": my_spark_datasource_config},
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory='/'),
)
context = BaseDataContext(project_config=data_context_config)
I get the error message:
PermissionError: [Errno 13] Permission denied: '/expectations'
Traceback (most recent call last):
However, I don't have a directory called '/expectations'.
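My assumption is that '/expectations' comes from the store layout that FilesystemStoreBackendDefaults derives from root_directory (the expectations store seems to be placed under <root_directory>/expectations, which would explain '/expectations' when root_directory is '/'). A throwaway sketch of what I mean, using a hypothetical local path rather than my real one:

from great_expectations.data_context.types.base import FilesystemStoreBackendDefaults

# Hypothetical local path, used only to inspect the store configuration derived from it
defaults = FilesystemStoreBackendDefaults(root_directory='/tmp/great_expectations')

# The stores attribute shows the per-store backends built from root_directory,
# including the expectations store whose directory the PermissionError points at
print(defaults.stores)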
As a side note, I'm trying to run Great Expectations here.
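The my_spark_datasource_config referenced above isn't shown here; for anyone trying to reproduce this, a simplified Spark datasource config of the following shape (placeholder names, modelled on the standard Great Expectations Spark/RuntimeDataConnector example, not my actual config) can stand in for it:

my_spark_datasource_config = {
    "class_name": "Datasource",
    "execution_engine": {"class_name": "SparkDFExecutionEngine"},
    "data_connectors": {
        "my_runtime_data_connector": {
            "class_name": "RuntimeDataConnector",
            # Placeholder batch identifier; the name doesn't matter for reproducing the error
            "batch_identifiers": ["run_id"],
        }
    },
}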